We’ve got smartphones, and we’re about to get smart watches. Cars are getting pretty smart, too. And TVs. And thermostats, freezers, lights, and medical devices. The Internet of Things is dawning.

Pretty soon, everything will have a tiny sensor in it. Everything will be remote-controlled via an app, or even self-controlled with the right programming. Even better, what if all of your smart things could talk to each other? Your bed could talk to your shower (when you roll off it, it’s time to get some warm water running), your shower could talk to your coffee maker (so your beverage is brewed by the time you get dressed), and your coffee maker could talk to your car (when you put the mug back, it’s time to start defrosting the windows). This cohesive automated behavior is where the world is heading, but right now many trails have yet to be blazed.

The obvious first step is to embed the right kinds of sensors in your products. Do you need to be able to sense temperature? Humidity? Altitude?
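To make that concrete, here’s a rough sketch of what the sensing side might look like on the device itself. Everything in it is hypothetical: read_temperature_humidity() is a stand-in for whatever driver your particular sensor actually ships with.

    import time

    def read_temperature_humidity():
        """Stand-in for a real sensor driver (an I2C or GPIO library call on real hardware)."""
        return 21.5, 40.0  # placeholder values: degrees Celsius, percent relative humidity

    def sensor_loop(interval_seconds=5):
        """Poll the sensor on a fixed interval and report the latest reading."""
        while True:
            temperature, humidity = read_temperature_humidity()
            reading = {
                "temperature_c": temperature,
                "humidity_pct": humidity,
                "read_at": time.time(),
            }
            print(reading)  # in a real product this would feed an app or a home hub
            time.sleep(interval_seconds)

    if __name__ == "__main__":
        sensor_loop()

The specific readings don’t matter; the point is that your product needs a loop like this before it has anything interesting to say.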

The next step is to consider how a shower could talk to a coffee maker at all, particularly since you might make one of those products but you probably don’t make both. How can you be sure that your smart-shower language translates to coffee-maker language?
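One way to frame the problem: before a shower and a coffee maker can cooperate, their makers have to agree on a shared message format. Here’s a purely illustrative sketch, assuming the two manufacturers settled on the event names and fields in advance; none of these identifiers come from any real product.

    import json

    # Hypothetical event names the two manufacturers agree on ahead of time.
    SHOWER_STARTED = "shower.started"
    BREW_COFFEE = "coffeemaker.brew"

    def make_event(event_type, device_id, payload=None):
        """Build an event in the agreed-upon shared format."""
        return json.dumps({
            "type": event_type,
            "device_id": device_id,
            "payload": payload or {},
        })

    def coffee_maker_handler(raw_event):
        """How a coffee maker might react to an event it recognizes."""
        event = json.loads(raw_event)
        if event["type"] == SHOWER_STARTED:
            # Someone is in the shower, so start brewing now.
            return make_event(BREW_COFFEE, "coffeemaker-1", {"cups": 1})
        return None  # ignore events this device doesn't understand

    if __name__ == "__main__":
        message = make_event(SHOWER_STARTED, "shower-1")
        print(coffee_maker_handler(message))

The hard part, of course, is getting two companies that have never met to agree on those names and fields, which is exactly where standards come in.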

Fortunately, there are people working on that. The Open Geospatial Consortium (OGC) has been hammering out protocols and standards that real-world objects can use to communicate with each other and with the human internet. As you start thinking about making your products intelligent, keep up with the standards that will prevent them from shouting to themselves in a language nothing else speaks.
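To give a flavor of what that looks like in practice, one OGC effort in this space is the SensorThings API, which models sensor readings as Observations you POST to a REST endpoint. The sketch below is only an approximation: the server URL, Datastream ID, and reading are invented, and any real deployment would supply its own.

    import json
    import urllib.request

    # Hypothetical SensorThings-compatible server and Datastream ID; a real
    # deployment would have its own endpoint and entity IDs.
    SERVER = "https://example.com/v1.0"
    DATASTREAM_ID = 1

    def publish_observation(result, phenomenon_time):
        """POST a single sensor reading as a SensorThings Observation."""
        observation = {
            "phenomenonTime": phenomenon_time,
            "result": result,
            "Datastream": {"@iot.id": DATASTREAM_ID},
        }
        request = urllib.request.Request(
            SERVER + "/Observations",
            data=json.dumps(observation).encode("utf-8"),
            headers={"Content-Type": "application/json"},
            method="POST",
        )
        with urllib.request.urlopen(request) as response:
            return response.status

    if __name__ == "__main__":
        # This will only succeed against a real SensorThings server.
        print(publish_observation(21.5, "2024-01-01T07:00:00Z"))
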

By Sharon Campbell