
Paper presented last month
In a paper presented last month, Hari Balakrishnan of the Massachusetts Institute of Technology suggests how a device's movements could be predicted by sampling data from built-in motion, positioning and other sensors, bumping throughput by around 50%. Such bumps are not trivial. Telecoms companies could save billions by using existing spectrum holdings more efficiently, and increase consumer satisfaction, especially when it comes to data-intensive uses such as streaming video. The improvements could also reduce corporate network spending while boosting the actual speed of home broadband networks. These at times appear slow because of a poor wireless link, rather than because of the connection from home to the service provider.
Mobile-network design enables relatively seamless data transfer on the go thanks to overlapping coverage areas arranged in a rough honeycomb of cells. As a user moves away from the sweet spot of best signal reception in a given cell toward the boundary with other cells, the mobile phone, tablet, laptop or other gadget slows down: lower signal strength and quality reduce the rate of data transmission. When the signal becomes too weak, or another, stronger station appears, the device switches connections.
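The switching rule amounts to a simple comparison of signal strengths. The sketch below is illustrative only (the threshold, the hysteresis margin and the cell names are made up, not taken from any handset's firmware):

```python
# Illustrative handoff rule: stay on the current cell unless it is too weak
# or a neighbour is clearly stronger. All figures are assumptions.

HANDOFF_THRESHOLD_DBM = -100   # below this the current link is too weak
HYSTERESIS_DB = 6              # a new cell must be clearly stronger to win

def choose_cell(current_cell, signal_dbm):
    """Pick which cell to stay attached to, given measured signal strengths.

    signal_dbm maps cell id -> received signal strength in dBm.
    """
    strongest = max(signal_dbm, key=signal_dbm.get)
    current = signal_dbm.get(current_cell, float("-inf"))

    too_weak = current < HANDOFF_THRESHOLD_DBM
    clearly_better = signal_dbm[strongest] > current + HYSTERESIS_DB
    if too_weak or (strongest != current_cell and clearly_better):
        return strongest          # hand off to the stronger station
    return current_cell           # otherwise stay put

# A user walking toward a cell edge: the serving cell fades, a neighbour rises.
print(choose_cell("A", {"A": -75, "B": -90}))   # stays on A
print(choose_cell("A", {"A": -104, "B": -82}))  # hands off to B
```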
The same speed
Devices attached to wireless networks do not all run at the same speed, however. The highest speeds are possible only in close proximity to a mobile or Wi-Fi base station, and with little cross-talk from other networks or unrelated interferers. As a device recedes, it and the base station negotiate a stepped-down series of slower speeds that allow communication at greater distances. As a result, some data packets sent before such negotiations are wasted and cannot be delivered, and additional packets are needed to ratchet the speed back up.
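That stepped-down negotiation works much like the classic auto-rate-fallback schemes used in Wi-Fi: back off after losses, probe upwards again after a run of clean transmissions. A rough sketch, with an illustrative rate ladder and thresholds rather than figures from any particular standard:

```python
# Rough sketch of stepped rate negotiation in the spirit of auto rate
# fallback; the rate table and thresholds are illustrative assumptions.

RATES_MBPS = [6, 9, 12, 18, 24, 36, 48, 54]   # e.g. an 802.11a/g-style ladder

def adjust_rate(index, recent_losses, recent_successes):
    """Move one step down the ladder after losses, one step up after a run
    of clean transmissions. Returns the new index into RATES_MBPS."""
    if recent_losses >= 2 and index > 0:
        return index - 1                      # packets were wasted; back off
    if recent_successes >= 10 and index < len(RATES_MBPS) - 1:
        return index + 1                      # probe a faster rate again
    return index

# As a device walks away from the base station, losses push it down the ladder.
idx = 7                                       # start at 54 Mbps, close in
for losses, successes in [(2, 0), (2, 0), (0, 10)]:
    idx = adjust_rate(idx, losses, successes)
    print(RATES_MBPS[idx], "Mbps")            # 48, 36, then back up to 48
```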
Slower communications do not just hold up the device in question. They also take longer to transmit, reducing the bandwidth available to all devices connected to the same set of frequencies. The upshot is a slew of languorous individual connections, as well as a reduction in the channel's capacity to handle the maximum number of users. Handoffs between base stations in different cells add further inefficiencies, causing transmission judders for voice and video.
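The arithmetic behind that capacity loss is stark. Because stations take turns on the channel, airtime rather than nominal speed is the scarce resource, so a single slow client drags the aggregate down towards a harmonic mean. A back-of-the-envelope illustration, with made-up figures:

```python
# Why one slow client slows everyone: airtime per packet is inversely
# proportional to the sender's rate. Figures below are illustrative.

def aggregate_throughput(rates_mbps):
    """Total throughput when each station sends equal-sized packets in turn."""
    n = len(rates_mbps)
    airtime_per_round = sum(1.0 / r for r in rates_mbps)
    return n / airtime_per_round   # the harmonic-mean effect

print(aggregate_throughput([54, 54]))   # two fast clients: 54 Mbps total
print(aggregate_throughput([54, 6]))    # one fast, one slow: about 10.8 Mbps
```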
Dr Balakrishnan and his team have come up with a set of solutions to this logjam that look promising, if early experiments on the campus Wi-Fi network and tests using mobile devices are any guide. Among the techniques, Dr Balakrishnan is particularly fond of using smartphone and tablet sensors to provide hints about a user's motion and direction. A modern smartphone may have an accelerometer, gyroscope, magnetometer, proximity detector, barometer and GPS receiver. Mobile and Wi-Fi radios can also be used to detect motion, by gauging varying signal strengths of nearby transmissions.
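The last of those ideas can be illustrated simply: a device that is moving sees its received signal strength fluctuate far more than a stationary one. A hypothetical sketch (the variance threshold is an assumption, not a published figure):

```python
# Hypothetical sketch: infer motion from how much nearby signal strengths
# fluctuate over a short window. Threshold is a made-up illustration.
from statistics import pvariance

MOTION_VARIANCE_THRESHOLD = 4.0   # dB^2; stationary links fluctuate far less

def probably_moving(rssi_samples_dbm):
    """Guess motion from a short window of received-signal-strength readings."""
    return pvariance(rssi_samples_dbm) > MOTION_VARIANCE_THRESHOLD

print(probably_moving([-60, -61, -60, -59, -60]))   # steady readings: False
print(probably_moving([-60, -66, -58, -70, -63]))   # fluctuating: True
```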
The approximate heading
By determining the approximate heading, velocity and acceleration of a device, software developed by Dr Balakrishnan's group can predict and assign the optimum data rate for communicating with an attached network, without all the fuss of negotiating slower or faster transmissions. Dr Balakrishnan says this is possible thanks to the combined effect of access to many sensors. Data from the compass, the gyroscope and the accelerometer can be pooled in ways that make the real-time positioning data more robust than data from any individual sensor, he explains.
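As a rough illustration of what such pooling might look like, the sketch below blends a compass heading with a gyroscope-derived one and then maps predicted speed straight to a transmission rate. Every name, weight and table entry is an assumption made for illustration, not Dr Balakrishnan's actual software:

```python
# Illustrative pooling of compass and gyro headings, plus a direct mapping
# from predicted speed to transmission rate. All values are assumptions.
import math

def fuse_heading(compass_deg, gyro_integrated_deg, weight=0.7):
    """Blend a noisy-but-absolute compass heading with a smooth-but-drifting
    gyro estimate (a simple complementary filter)."""
    # blend on the unit circle to handle wrap-around at 0/360 degrees
    cx = weight * math.cos(math.radians(compass_deg)) + \
         (1 - weight) * math.cos(math.radians(gyro_integrated_deg))
    cy = weight * math.sin(math.radians(compass_deg)) + \
         (1 - weight) * math.sin(math.radians(gyro_integrated_deg))
    return math.degrees(math.atan2(cy, cx)) % 360

def pick_rate_mbps(speed_m_per_s):
    """Assign a rate directly from predicted motion: fast movers get a
    conservative rate, stationary users the aggressive one."""
    if speed_m_per_s < 0.5:
        return 54      # effectively stationary
    if speed_m_per_s < 2.0:
        return 24      # walking pace
    return 12          # in a vehicle or moving quickly

print(round(fuse_heading(350, 10)))   # blends to about 356, i.e. near north
print(pick_rate_mbps(1.4))            # walking pace -> 24 Mbps
```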
One example of such pooling currently being tested involves combining barometric data from pressure sensors with signal strength from known Wi-Fi base stations within a building. This can pinpoint which floor a device is on, and whether it is moving up or down. Even information about which apps are active (e-mail as opposed to a VoIP, or Voice over Internet Protocol, call, say) could help determine the best course of action for adjusting speed and base-station connections.
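A simple sketch of the floor-detection idea: convert the pressure reading into a rough height, snap it to a floor, and sanity-check the answer against a Wi-Fi base station whose floor is already known. The constants and access-point names below are assumptions:

```python
# Sketch of floor detection from barometric pressure plus a known Wi-Fi
# base station. Floor height, pressure gradient and AP map are illustrative.

FLOOR_HEIGHT_M = 3.5
KNOWN_AP_FLOORS = {"ap-lobby": 0, "ap-3rd-east": 3, "ap-5th-west": 5}  # hypothetical

def pressure_to_relative_altitude_m(pressure_hpa, ground_pressure_hpa):
    # near sea level, pressure drops roughly 0.12 hPa per metre of height
    return (ground_pressure_hpa - pressure_hpa) / 0.12

def estimate_floor(pressure_hpa, ground_pressure_hpa, strongest_ap):
    barometric_floor = round(
        pressure_to_relative_altitude_m(pressure_hpa, ground_pressure_hpa)
        / FLOOR_HEIGHT_M)
    wifi_floor = KNOWN_AP_FLOORS.get(strongest_ap)
    # if the two sources agree to within a floor, trust the Wi-Fi anchor
    if wifi_floor is not None and abs(wifi_floor - barometric_floor) <= 1:
        return wifi_floor
    return barometric_floor

print(estimate_floor(1011.7, 1013.0, "ap-3rd-east"))   # ~10.8 m up -> floor 3
```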
Previous work to update chips
The innovation builds on previous work to update chips and associated software so that they examine error rates at current transmission speeds. This would let a device work at 95-99% of its ideal data rate at any given time. Such research, however, is probably a decade away from deployment. Dr Balakrishnan's solution, in the meantime, involves tweaking the controlling code that handles rate adjustments and handoffs, dispensing with the need to fiddle with hardware or chips. It would, he claims, make it possible to reach 85-95% of the optimum rate, an improvement of as much as 50% compared with the current state of affairs.
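The arithmetic behind that claim is straightforward if one assumes, purely for illustration, that today's negotiation-based schemes typically land around 60% of the ideal rate:

```python
# Illustrative arithmetic for the claimed gain; the 60% baseline is an
# assumption for the example, not a figure from the paper.
current_fraction = 0.60          # assumed share of the optimum achieved today
predicted_fraction = 0.90        # midpoint of the 85-95% claimed above
improvement = predicted_fraction / current_fraction - 1
print(f"{improvement:.0%}")      # -> 50%
```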
