Computing is no longer just the domain of servers, desktops and notebooks. With all sorts of sensors embedded almost anywhere conceivable, it's inevitable that computing now has to be everywhere to process their inputs, sometimes even in real time.
And that is what Intel's CEO was driving at when he took center stage to deliver his keynote address at Intel's Developer Forum 2015 in San Francisco's Moscone West convention center.
In short, Intel's idea of personalized computing is based on three key assumptions or guiding principles:-
We’ll share some key highlights and innovations that are taking place for each of these assumptions.
Sight, sound, touch and feel. The new age of compute has to be capable of processing all of this sensory information in a reliable manner. With ever more powerful processors and advances in software and APIs, this is getting closer to reality.
Sound was the first sense to be showcased, through the Wake-on-Voice feature developed by Intel and Microsoft. Intel summoned a sleeping demo PC to wake up, and the response was so quick that we missed capturing it on video! However, we caught the follow-up action, so take a quick peek at Cortana's fluidity:-
Obviously there are some prerequisites, a Windows 10 OS being one of them.
Audio enhancements don't just end there though. In a collaboration with Google, Intel's new audio subsystem was tuned for Android's Lollipop OS (5.0) to produce a more natural audio experience through latency reductions that work only on an Intel hardware platform. Here's a quick demo:-
Why would this matter? Imagine relying on touch-based controls to play virtual audio instruments, where input and processing lag can kill the experience instantly. What if you wanted to improve your band's upcoming performance, and your mobile or tablet device is the weakest link in an otherwise smooth setup? It is these details that Intel wants to improve through an enhanced sensory experience.
Moving on from audio matters, the next area of advancement is vision, through Intel's RealSense 3D camera. While the camera isn't new, it has been featured in a whole array of projects, some of which are pretty noteworthy. At IDF 2015, Intel brought out the 6-inch smartphone first showcased at IDF 2015 Shenzhen, which features a more compact Intel RealSense 3D camera module.
This time around, the interesting announcement was a collaboration with Google to advance the mobile depth-sensing capabilities of Google's Project Tango into an Android smartphone developer kit. LG was once reported to be bringing Project Tango to consumers, but will that effort be redirected toward developers, or will there be another hardware vendor? It is early days yet, so further details are sketchy at the moment. Meanwhile, check out another live demo from the IDF 2015 keynote:-
Even more interesting is how Intel's RealSense 3D camera technology will be incorporated into robots serving the hospitality industry. Savioke, a Silicon Valley robot-maker, raised funds (from Google) early last year to build a robot for the service industry. As early as August 2014, Starwood's Aloft Hotels began robot service at a few select properties, and since then Savioke has been in the limelight as one of the first to revolutionize the service industry. What's it got to do with Intel?
At IDF 2015, Savioke showed off its Relay butler robot incorporating Intel's RealSense camera, while Intel CEO Brian Krzanich announced RealSense technology support for the Robot Operating System (ROS) that powers many robots across the robotics industry, including Savioke's Relay. Updated Relay robots with Intel RealSense camera technology will be available for deployment next year.
Further to that, Intel is opening up more opportunities for developers to create new depth-sensing hardware and software. In addition to Windows and Android, developers will be able to use Intel RealSense technology with Mac OS X, ROS (as outlined above), Linux, Scratch, Unity, XSplit, OBS, Structure SDK, OSVR, Unreal Engine 4 (UE4) and Google's Project Tango. These developer capabilities enable new industry solutions beyond the PC, extending to robotics, drones, vending machines, "magic mirrors" and many other exciting new devices to come.
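The depth data these platforms expose is conceptually simple: each frame is a grid of per-pixel distances. As a purely illustrative sketch (not any actual RealSense SDK or ROS API; the function name and the zero-means-invalid convention are our own assumptions, though the latter is common in depth SDKs), here is how an application might pull the nearest obstacle out of a mock depth frame:

```python
def nearest_object_mm(depth_frame):
    """Return the distance (in mm) of the closest valid pixel, or None.

    depth_frame: rows of per-pixel depths in millimetres; 0 marks pixels
    the sensor could not resolve (a common convention in depth SDKs).
    """
    valid = [d for row in depth_frame for d in row if d > 0]
    return min(valid) if valid else None

# Mock 3x3 depth frame standing in for real camera output.
frame = [
    [0,    1200, 1150],
    [980,  0,    1100],
    [1020, 995,  0],
]
print(nearest_object_mm(frame))  # prints 980
```

A robot like the Relay would run logic of this kind continuously over live frames to decide when to slow down or steer around guests.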
Already, Razer is working with Intel to deliver a RealSense camera peripheral for gamers, scheduled for the first quarter of 2016. Read more about it in this news piece.
To wrap it all together, we shift over to PC gaming, which typically stimulates multiple sensory inputs to immerse you in your virtual world. Combining the ultra-realistic driving simulator iRacing with one of the world's most advanced hardware simulators from VRX, Intel demonstrated a highly immersive gaming platform using Intel RealSense technology-based head tracking, running on the upcoming 6th Generation Intel Skylake processor. The head-tracking functionality was enabled by a plugin for opentrack, a tracking platform that supports over 500 gaming titles and now lets Intel RealSense camera users navigate their driving and flight simulators with real-time head tracking. The plugin will be made available to gamers later this year. Here's a slice of that experience from the keynote:-
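Under the hood, head tracking of this kind boils down to mapping the camera's estimate of your head angles onto the in-game view. As a hedged sketch (this is not opentrack's actual code or API; the gain and clamp values are illustrative assumptions), the mapping might look like:

```python
def view_offset(head_yaw_deg, head_pitch_deg, gain=2.0, limit=90.0):
    """Map tracked head angles (degrees) to an in-game camera offset.

    A gain above 1 lets small, comfortable head movements produce larger
    in-game rotations; limit clamps the result to a sane viewing range.
    """
    def clamp(v):
        return max(-limit, min(limit, v))

    return clamp(head_yaw_deg * gain), clamp(head_pitch_deg * gain)

# Glancing 20 degrees left and 5 degrees down turns the view twice as far.
print(view_offset(20.0, -5.0))  # prints (40.0, -10.0)
```

Trackers typically add smoothing and per-axis response curves on top of this, which is why a plugin layer between the camera and the game is useful.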
Intel shared two niche shopping experiences that are currently in selective use but have the potential to reinvent the shopping industry. The first is the Memory Mirror by Memomi, which once again utilizes Intel RealSense camera technology. This is far from your 'standard' shopping mirror.
Existing 'futuristic' implementations use cameras to image you onscreen and then let you overlay different clothing options from the interfacing system's database. You then swipe through the options to see what suits your face and form best. That's not really effective shopping, since you will ultimately have to try the clothes on to make a firm decision.
Enter Memomi's Memory Mirror, the world's first true digital mirror, which uses Intel RealSense camera technology and an Intel Core i7 processor with Iris graphics to power its processing. Here's a brief video from the keynote relating the in-shop experience with this product:-
With this mirror, you no longer have to wonder how your side or rear profile might look in that new dress you're eyeing. Sure, you can strain your neck to catch a glimpse in a conventional mirror, or you can simply capture a 360-degree view in front of the Memory Mirror and then view yourself comfortably. You can even save your try-ons and compare a variety of clothing to pick the one you like best to splurge upon; no more having to remember how outfit number 1 and number 10 fit you. If a particular garment comes in multiple colors, simply use an on-screen color selector to instantly see which suits you instead of multiple try-outs, accelerating your shopping experience. Finally, you can save a profile so your store sessions can be retrieved in the future for comparison, or share your shopping experience with others – the choice is yours. In a few words, this is garment shopping 2.0. Check out more of the experience in the company's promotional video.
The other 'shopping' experience isn't anywhere near as glamorous, but considering that vending machines are everywhere, the market for a smart, connected and modern vending machine is absolutely huge. Here's a demo of one such newfangled vending machine produced by N&W that uses Intel hardware:-
It goes without saying that the wearables space is an exciting and evolving industry where technology is merging with daily utility needs. As a largely untested and unproven industry, its potential is limited only by the creativity and feasibility of the next big thing that will send billions scrambling to own a piece of the action, just as smartphones did for the communications industry. Sci-fi flicks have shown us the potential; it only needs time to evolve along with advances in processing and battery capacity to make these devices your essential companions.
On stage with Intel's CEO at the IDF 2015 keynote, we were pleasantly surprised to see Fossil Group's EVP and Chief Strategy and Marketing Officer, Greg McKelvey, come up briefly to announce the company's entry into the smartwatch segment.
It will supposedly use Intel's hardware platform (though no details were mentioned), and it was shown running Android Wear, the platform where many competitors are already entrenched. It will be interesting to see high-end traditional timepiece companies jostle in this segment alongside mainstream electronics brands.
Wearables aren't just fancy personal extension devices; they can even extend into enterprise-grade use, whether for secure access or ease of connectivity to your authorized computing device, printer and more. We managed to catch one such example at IDF 2015, though we have to say it's a very early interpretation of an enterprise wearable:-
That about sums up Intel's vision of personalized computing and how it will permeate our entire society. We'll end this article with Intel's CEO making his second attempt at summoning his spider bot army to interact with him and get the audience moving:-