One of the key innovations in everyAir was a simple video codec designed for low-latency streaming. My goal was to ensure that, aside from network-induced latency, the video encode and decode steps consumed no more than 8 milliseconds of processing in total (on 2010-era PC/mobile hardware). This budget ensured that the decoding device never fell more than a single video frame behind.
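To see why 8 milliseconds keeps the receiver within one frame, here's a quick back-of-the-envelope check. The 8 ms figure is the codec budget stated above; the 60 Hz refresh rate is my illustrative assumption, not a number from the original design.

```python
# Assumed: a 60 Hz display, i.e. ~16.7 ms per frame.
FRAME_INTERVAL_MS = 1000.0 / 60.0
CODEC_BUDGET_MS = 8.0  # combined encode + decode budget from the post

# If encode + decode fit comfortably inside one frame interval, the
# receiver can present each frame before the next one is captured,
# so it never falls more than a single frame behind.
assert CODEC_BUDGET_MS < FRAME_INTERVAL_MS

slack_ms = FRAME_INTERVAL_MS - CODEC_BUDGET_MS
print(f"Per-frame slack left for capture, transport, and display: {slack_ms:.1f} ms")
```

Under these assumptions, roughly half of each frame interval remains for everything else in the pipeline.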
I've posted a short paper that details the design of this codec in my papers section.
On the iPhone 3GS, we saw costs of roughly 1 millisecond for a full-frame decode, and roughly 1 millisecond for the PC-side CPU encode. Note that it is notoriously difficult to give accurate benchmark results for video codecs (due to variance in content, resolution, device horsepower, etc.). For reference, our encoding PCs were equipped with Intel Core 2 Duos @ 2.2 GHz.
In terms of bandwidth consumption, our codec was generally 2-4x worse than h.264. This was acceptable for this product, which relied upon the custom codec only in LAN scenarios, where users could afford the higher bandwidth costs in exchange for significantly lower latencies.
Over WAN, network latency was usually dominant, so the product would revert to using h.264 to maintain bandwidth limits (though we ultimately planned to migrate away from h.264). For reference, using identical hardware, h.264 generally took around 18 milliseconds to decode (iPhone 3GS), and about 15 milliseconds to encode (Core 2 Duo @ 2.2 GHz).
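The trade-off above amounts to a simple selection policy: pay bandwidth for latency on the LAN, pay latency for bandwidth over the WAN. A hypothetical sketch of that policy is below; the function name, the RTT threshold, and the decision inputs are illustrative assumptions, not everyAir's actual logic.

```python
# Hypothetical codec-selection policy: prefer the low-latency custom
# codec on a LAN, fall back to h.264 over a WAN where bandwidth (not
# codec latency) is the binding constraint. Threshold is illustrative.

def choose_codec(is_lan: bool, rtt_ms: float, lan_rtt_threshold_ms: float = 10.0) -> str:
    if is_lan and rtt_ms <= lan_rtt_threshold_ms:
        # ~2 ms combined encode+decode, at 2-4x the bandwidth of h.264
        return "custom"
    # ~33 ms combined encode+decode on the same hardware, far lower bitrate
    return "h264"

print(choose_codec(is_lan=True, rtt_ms=3.0))    # LAN, low RTT -> custom
print(choose_codec(is_lan=False, rtt_ms=60.0))  # WAN -> h264
```

When network latency already dwarfs codec latency, the extra ~30 ms of h.264 processing buys a large bitrate reduction, which is why the WAN path favors it.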
Check out the video below to see the codec in action. In this video, an iPhone is overlaid on top of a monitor. Video content plays on the monitor while simultaneously being wirelessly streamed to the iPhone. Although there is visible latency caused by the network, the two videos remain reasonably in sync.