Happy iPhone 6 day. If you're reading this, you're probably not standing in line hoping to get your hands on Apple's latest devices. My colleague Mike Facemire drove past the local Apple store in Back Bay last night at 1 A.M. on the way home from Logan airport and described the scene as "nuts." The line wrapped completely around the block, in 40-degree weather no less.

Developers should pay attention, as there's more going on here than hipsters queuing for the latest shiny. Today Mike, Julie Ask, and yours truly published a research note for eBusiness professionals detailing the top ten ways to leverage Apple's new tech. Central to our argument is that iOS 8 takes many steps to break down the barriers between custom third-party apps and Apple's mobile platform. Mobile apps used to be confined to their own secure, sandboxed containers, with minimal access to device sensors and local storage and little ability to interact with other custom apps. As a result, we saw development teams gradually move toward "least common denominator" apps that saved money by using a common code base.

While these LCD apps might save money and time, they don't advance the platform ambitions of Apple, Google, and Microsoft. In response, these vendors are now opening up more and more platform-unique services to developers and breaking down the walls between apps and their mobile platforms: Apple is doing it with iOS 8, and Google will soon follow with Android L. Because of this, customer engagement opportunities are shifting to favor "micro-moments": brief interactions where developers can get customers' attention and anticipate their needs. Instead of customers intentionally using apps a few times a day, developers need to think about how they engage customers in 5- to 10-second interactions many times a day. As a result, development focus shifts toward notifications, widgets, and cross-device interactions, all of which are better supported in iOS 8 with new APIs. In a sense, the golden age of the self-contained app is over, and developers need to adjust.
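To make the micro-moment idea concrete, here's a minimal sketch of one of those new iOS 8 APIs: interactive notification actions, which let a user respond from the lock screen without ever opening the app. The framework calls are iOS 8's UIKit notification APIs as of launch; the identifiers ("REPLY_ACTION", "MESSAGE_CATEGORY") are illustrative placeholders, not anything prescribed by Apple.

```swift
import UIKit

// Define an action the user can take directly from a notification --
// a 5- to 10-second micro-moment instead of a full app launch.
let replyAction = UIMutableUserNotificationAction()
replyAction.identifier = "REPLY_ACTION"          // illustrative identifier
replyAction.title = "Reply"
replyAction.activationMode = .Background          // handle without foregrounding the app
replyAction.authenticationRequired = false

// Group actions into a category referenced by outgoing notifications.
let messageCategory = UIMutableUserNotificationCategory()
messageCategory.identifier = "MESSAGE_CATEGORY"   // illustrative identifier
messageCategory.setActions([replyAction], forContext: .Minimal)

// Register the settings; typically done in
// application(_:didFinishLaunchingWithOptions:).
let settings = UIUserNotificationSettings(
    forTypes: .Alert | .Sound,
    categories: NSSet(object: messageCategory))
UIApplication.sharedApplication().registerUserNotificationSettings(settings)
```

When the user taps the "Reply" button, iOS calls the app delegate's `application(_:handleActionWithIdentifier:forLocalNotification:completionHandler:)` method in the background, which is exactly the kind of brief, glanceable interaction a micro-moment strategy is built on.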

But there's an even more important reason that developers need to think about a micro-moment strategy: it's how they will support the ever-expanding crop of wearables, like the Samsung Gear Live I've been wearing for three months now or the Apple Watch that's set to ship in early 2015. Micro-moments are also critically important as interactions shift to cars and heads-up devices like Google Glass. The reason? These devices must project a digital reality that meshes with physical reality instead of substituting for it. Think about it: you don't want a modal dialog box popping up in the middle of a windscreen or impeding vision on an eyeglass lens. You want a peripheral notification that can be quickly seen and dismissed or responded to, ideally without fumbling to type or click.

We're at an exciting (and scary) time for developers: our digital world is quickly merging with the physical world. Computer vision with platforms like the NVIDIA Jetson TK1 is within reach of the mainstream; check out Google's Project Tango if you don't believe me. Voice recognition and virtual assistants like Cortana and Siri are going mainstream, and their voice interaction capabilities are improving steadily. I personally find I take my phone out of my pocket 50% less often than I used to, as the voice recognition capabilities of the Samsung Gear Live are good enough to respond to SMS messages, Google Hangouts messages, and even simple e-mails. And subtly checking haptic alerts on the wrist while in a meeting is now the norm. The bottom line: we now have acceptable input and output mechanisms for three of our five senses, letting developers rapidly process physical feedback and overlay digital information in a way that feels natural and intuitive.

I used to think that I would be long retired before developers would have to deal with issues like how to simulate emotion in the voices of digital assistants or the ethics of programming a self-driving car to avoid a crash. But I'm now reassessing that line of thought. The early days of digital-physical convergence are upon us, and development professionals need to think about the implications, and about how this will change the way we build software. I'll be speaking about this at Forrester's Forum For Application Development and Delivery Professionals in Chicago on October 16th. Hope to see you there!