Build and learn
April 17, 2025

When code meets reality: Lessons from building in the physical world

How web developers can leverage their existing skills to build smarter, real-world systems.
Joyce Lin
Head of Developer Relations

As web developers, we get a lot of things for free: controlled environments, predictable inputs, and tight feedback loops. Testing, monitoring, and CI/CD pipelines help us catch and recover from failure quickly. But once code steps off the screen and into the physical world, we give up some of that control.

The good news? You don’t need to throw out your existing skillset. The foundations you already have in web development translate more than you might think.

At Viam, we build open-source tools that connect software to hardware. That means robotics, IoT, and edge computing. It’s fun, powerful…and deeply humbling. Because no matter how clean your code is, reality doesn’t always play along. Check out my lightning talk at DotJS about how Code in the Physical World can get a little messy.

Failure looks different in the physical world

Compared to software development environments, physical environments pose different challenges, and failures carry different consequences.

Unlike in web apps, a failed update to a physical system can’t be fixed with a quick deploy. In the physical world, failure is a car that won’t start, a house without heat, or engineers with crowbars trying to get back into their own data centers.

And it’s not just human error:

Buckley the sea turtle being tracked with a satellite tag

So what can developers do?

We can’t count on eliminating failure altogether, but we can plan for it. Just like we design UIs for low-bandwidth conditions or fallback states, physical systems need graceful failure modes too.

  • Your phone doesn’t catch fire when it overheats; it throttles the CPU, dims the screen, and pauses charging. That’s a designed failure state.
  • NASA’s Voyager 1 is still running almost 50 years after launch, with no way to physically replace components. That longevity comes largely from careful planning: failovers, redundancy, and deliberate tradeoffs.
  • Starlink satellites are designed to be safely decommissioned at end of life, burning up cleanly and leaving no debris in space.

When you build for the physical world, failure is expected. The goal isn’t to prevent it; it’s to control how it happens.
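In code, a graceful failure mode is often just an explicit degradation path instead of an unhandled crash. Here’s a minimal sketch of the idea in JavaScript; the readTemperature and setMotorPower helpers and the thresholds are hypothetical stand-ins for whatever your hardware layer actually exposes:

```javascript
// Hypothetical helpers standing in for your hardware layer:
// readTemperature() resolves to degrees Celsius, setMotorPower() takes 0–1.
async function superviseMotor({ readTemperature, setMotorPower }) {
  const celsius = await readTemperature();

  if (celsius > 90) {
    // Too hot to run safely: shut down rather than cook the hardware.
    await setMotorPower(0);
    return "shutdown";
  }

  if (celsius > 70) {
    // Getting warm: throttle instead of failing outright,
    // the same way a phone dims its screen and slows its CPU.
    await setMotorPower(0.5);
    return "throttled";
  }

  await setMotorPower(1);
  return "ok";
}
```

The numbers and names are invented for illustration; the point is that the degraded states are designed on purpose rather than discovered in production.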

Writing code that controls hardware

You shouldn’t need to write firmware or low-level code to build real-world systems. That’s where projects like TC53 come in, standardizing JavaScript APIs for physical devices.
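For instance, ECMA-419 (the TC53 specification, implemented by the Moddable SDK) lets you drive a GPIO pin in a few lines of JavaScript instead of C firmware. A rough sketch, where the pin number is a placeholder that depends on your board:

```javascript
// ECMA-419 style digital IO, as implemented by the Moddable SDK.
import Digital from "embedded:io/digital";

const led = new Digital({
  pin: 2,               // placeholder: use the pin your LED is actually wired to
  mode: Digital.Output,
});

led.write(1); // drive the pin high to turn the LED on
```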

What’s more, you don’t even need to write code that runs on the device; you can write code that controls the device remotely. If you’ve worked with REST APIs, event handlers, or browser-based tools, the jump from purely digital interfaces to code that interacts with hardware is smaller than you might expect.

  • WebRTC: With browser-native APIs like WebRTC, you can control a physical device from your browser; no code needs to be deployed to the device itself. A minimal browser-side sketch follows the diagram below.

Diagram of the WebRTC connection flow between a browser and a robot: 1. the robot sends a “ready” signal to a cloud signaling server; 2. the browser dials the robot through that server; 3. the robot receives a “caller incoming” notification; 4. the browser receives the robot’s connection info; 5. a direct peer-to-peer connection is established between the robot and the browser.
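Here’s roughly what the browser side of that flow can look like. This is a sketch, not Viam’s implementation: it assumes you already have a signaling channel to the server in the diagram (a WebSocket here, with a made-up URL and message format) and that the robot answers the offer with its own session description and ICE candidates.

```javascript
// Browser side: open a peer connection plus a data channel for commands.
const signaling = new WebSocket("wss://signaling.example.com/my-robot"); // hypothetical server
const peer = new RTCPeerConnection({
  iceServers: [{ urls: "stun:stun.l.google.com:19302" }],
});
const commands = peer.createDataChannel("commands");

// Send our ICE candidates to the robot via the signaling server.
peer.onicecandidate = ({ candidate }) => {
  if (candidate) signaling.send(JSON.stringify({ type: "candidate", candidate }));
};

// Once signaling is up, make the "call": create an offer and send it over.
signaling.onopen = async () => {
  const offer = await peer.createOffer();
  await peer.setLocalDescription(offer);
  signaling.send(JSON.stringify({ type: "offer", sdp: offer.sdp }));
};

// Apply the robot's answer and candidates as they arrive.
signaling.onmessage = async (event) => {
  const msg = JSON.parse(event.data);
  if (msg.type === "answer") await peer.setRemoteDescription({ type: "answer", sdp: msg.sdp });
  if (msg.type === "candidate") await peer.addIceCandidate(msg.candidate);
};

// When the channel opens, commands flow peer to peer, straight to the robot.
commands.onopen = () => {
  commands.send(JSON.stringify({ type: "drive", speed: 0.5 })); // made-up command format
};
```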

  • Johnny-Five: This is a JavaScript framework that runs in Node.js on your laptop and talks to an Arduino board over a serial connection, like a remote control (see the sketch after this list).
  • Viam: This is an open-source platform that abstracts away the finer details of integrating with drivers, networking, and security, so you can plug and play with modules to create your dream machines and then use the SDKs to build applications that control them.
Diagram showing Viam's architecture connecting physical devices with cloud services. The left 'Physical World' side shows sensors, cameras, and actuators linked to a computer with local storage and processing. The right 'Cloud' side contains configuration, AI models, and data analysis tools. The systems connect through application interfaces and data synchronization, with a workflow for team members to manage devices and perform fleet-wide configuration and troubleshooting.
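To make the Johnny-Five model concrete: the Node.js process below runs on your laptop, while the Arduino only needs the stock Firmata firmware that Johnny-Five speaks to over serial. Pin 13 is the built-in LED on many boards, not a requirement.

```javascript
// Runs in Node.js on your laptop; nothing custom is deployed to the Arduino.
const { Board, Led } = require("johnny-five");

const board = new Board(); // auto-detects the Arduino on a serial port

board.on("ready", () => {
  const led = new Led(13); // built-in LED pin on many Arduino boards
  led.blink(500);          // toggle every 500 ms

  // All of the control logic stays here, where it's easy to edit and rerun.
});
```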

The real lesson? Keep the logic where it's easy to test, debug, and update. In this way, we can leverage our existing skillset to control devices without needing to dive deep into low-level code.


The shift toward embodied AI

We’ve had the fundamental technologies for quite a while: computer vision, robot vacuums, Mars rovers that execute pre-planned logic. So what’s different now that lets AI grow beyond the world of software and into the physical world?

Three key things have made embodied AI possible:

  • Low-cost, high-performance compute
  • Tooling and ecosystem to support deploying models and managing fleets
  • Enough training data to go from rule-based execution to on-the-fly learning

Together, these unlock a huge shift: systems that can sense, act, adapt, and recover, much like people do.

And we’re not just shipping to the cloud anymore. We’re shipping to physical devices:

  • Phones
  • Wearables
  • Cars
  • Drones
  • Robots

The boundary between code and the real world is disappearing, and that’s why this moment is so exciting for web developers. The shift toward client-side AI and edge compute means the skills you already have are in demand in this new landscape.

Want to try it yourself?

Modern platforms like Viam make it possible to control real-world hardware using familiar tools like JavaScript, Python, and APIs. 

If you’re curious about working with real-world hardware using modern software tools, check out the Viam docs. You don’t need to be an embedded systems engineer, just someone who understands how systems interact and wants to see their code move something in the real world.
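As a taste of what that looks like, here is a rough sketch of connecting to a machine with Viam’s TypeScript/JavaScript SDK (@viamrobotics/sdk) and listing its configured resources. The host, key, and even the exact shape of the options object are placeholders; credential fields have shifted between SDK versions, so follow the current connection snippet in the Viam docs rather than copying this verbatim.

```javascript
import { createRobotClient } from "@viamrobotics/sdk";

// Placeholder address and credentials, copied in practice from your machine's
// connect tab in the Viam app. Option names here are assumptions and may differ
// by SDK version; check the docs for the exact shape.
const machine = await createRobotClient({
  host: "my-machine-main.abc123.viam.cloud",
  credentials: {
    type: "api-key",
    payload: "<YOUR-API-KEY>",
    authEntity: "<YOUR-API-KEY-ID>",
  },
  signalingAddress: "https://app.viam.com:443",
});

// List every component and service configured on the machine.
console.log(await machine.resourceNames());
```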

Originally published on dev.to: https://dev.to/joycejetson/when-code-meets-reality-lessons-from-building-in-the-physical-world-3pia


Find us at our next event

May 6, 2025, 07:00–09:00 PM EST

Elastic New York Meetup

In Person
New York, NY
Monitor and automate the physical world with Elastic and Viam. Join us for a demonstration of gathering data from a fleet of sensors, visualizing it with Kibana, and creating alerting rules that trigger in real life.

May 5–7, 2025

Shift Miami

In Person
Perez Art Museum, 1103 Biscayne Blvd, Miami, FL
Interested in robotics, but don't know where to start? Meet Viam in Miami, where Adrienne Tacke will discuss how to get up and running, even if you're "just" a software developer.

May 7, 2025

Deploying and scaling AI with hardware

Virtual
Curious how startups are using Viam to build smart, vision-enabled products, even on low-power hardware? Join Viam engineers for a live computer vision demo and Q&A.

Jun 12–16, 2025

JS Nation

In Person
Amsterdam
WebRTC is most often associated with building video and text chat into browsers, but this peer-to-peer technology can also be used to monitor and control machines from anywhere in the world! Join Nick Hehr to learn about industrial arms, DIY rovers, and dashboards of data in real time.

Jun 23–25, 2025

Open Source Summit North America 2025

In Person
Denver, CO
Edge-based computer vision gives us real-time insights, but getting that data where it needs to go without high bandwidth, lag, or hardware strain is a big challenge. Learn how to build a fast, event-driven vision pipeline.