Your smartphone could soon make your commute much less stressful
Last updated: Friday, 12 October 2018
Apps that can detect what mode of transport phone users are travelling on and automatically offer relevant advice are set to become a reality after extensive data-gathering research led by the University of Sussex.
Researchers at the University of Sussex’s Wearable Technologies Lab believe the machine learning techniques developed in a global research competition they initiated could also enable smartphones to predict upcoming road conditions and traffic levels, offer route or parking recommendations, and even detect what a phone user eats and drinks while on the move.
Professor Daniel Roggen, a Reader in Sensor Technology at the University of Sussex, said: “This dataset is truly unique in its scale, the richness of the sensor data it comprises and the quality of its annotations. Previous studies generally collected only GPS and motion data.
"Our study is much wider in scope: we collected all sensor modalities of smartphones, and we collected the data with phones placed simultaneously at four locations where people typically carry their phones such as the hand, backpack, handbag and pocket. This is extremely important to design robust machine learning algorithms. The variety of transport modes, the range of conditions measured and the sheer number of sensors and hours of data recorded is unprecedented.”
Professor Roggen and his team collected the equivalent of more than 117 days’ worth of data monitoring aspects of commuters’ journeys in the UK using a variety of transport methods to create the largest publicly available dataset of its kind.
The project, whose findings will be presented at the UbiComp conference in Singapore on Friday (12 October), gathered data from four mobile phones carried by researchers as they went about their daily commutes over seven months.
The team launched a global competition challenging teams to develop the most accurate algorithms for recognising eight modes of transport (sitting still, walking, running, cycling, or taking the bus, car, train or subway) from data collected by 15 sensors measuring everything from movement to ambient pressure.
The project, supported by Chinese telecoms giant Huawei and run with academics at Ritsumeikan University and Kyushu Institute of Technology in Japan and Saints Cyril and Methodius University of Skopje in Macedonia, attracted 17 teams. Two entries achieved more than 90% accuracy, eight scored between 80% and 90%, and nine between 50% and 80%.
The winning team, JSI-Deep of the Jozef Stefan Institute in Slovenia, achieved the highest score of 93.9% using a combination of deep and classical machine learning models. In general, deep learning techniques tended to outperform traditional machine learning approaches, although not by a large margin.
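For readers unfamiliar with this kind of task, the minimal sketch below illustrates the general idea: short windows of motion-sensor readings are reduced to summary features and passed to a classifier that predicts one of the eight transport modes. It is a hypothetical example built on synthetic data and a scikit-learn random forest, not the SHL processing pipeline or the winning JSI-Deep entry, and the helper names (make_synthetic_windows, extract_features) are illustrative only.

```python
# Hypothetical sketch of transport-mode classification from windowed sensor data.
# NOT the SHL pipeline or the JSI-Deep entry; synthetic data stands in for real
# accelerometer recordings to show the feature-extraction + classifier pattern.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

MODES = ["still", "walk", "run", "bike", "car", "bus", "train", "subway"]
rng = np.random.default_rng(0)

def make_synthetic_windows(n_windows=800, window_len=500):
    """Generate fake 3-axis accelerometer windows, each labelled with a mode."""
    X, y = [], []
    for i in range(n_windows):
        label = i % len(MODES)
        # Different noise levels stand in for the different motion signatures.
        X.append(rng.normal(scale=0.2 + 0.3 * label, size=(window_len, 3)))
        y.append(label)
    return np.array(X), np.array(y)

def extract_features(windows):
    """Simple per-axis statistics (mean, std, min, max) for each window."""
    return np.concatenate(
        [windows.mean(axis=1), windows.std(axis=1),
         windows.min(axis=1), windows.max(axis=1)],
        axis=1)

X_raw, y = make_synthetic_windows()
X = extract_features(X_raw)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
print("held-out accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```

In the actual competition, entrants worked with the full multimodal SHL recordings and considerably more sophisticated models than this toy setup.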
It is now hoped that the highly versatile University of Sussex-Huawei Locomotion-Transportation (SHL) dataset will be used for a wide range of studies exploring transportation mode recognition, mobility pattern mining, localisation, tracking and sensor fusion.
Professor Roggen said: “By organising a machine learning competition with this dataset we can share experiences in the scientific community and set a baseline for future work. Automatically recognising modes of transportation is important to improve several mobile services – for example to ensure video streaming quality despite entering tunnels or subways, or to proactively display information about connection schedules or traffic conditions.
"We believe other researchers will be able to leverage this unique dataset for many innovative studies and novel mobile applications beyond smart-transportation, for example to measure energy expenditure, detect social interaction and social isolation, or develop new low-power localisation techniques and better mobility models for mobile communication research.”
To read the paper in full, and for more information about the dataset, visit https://ieeexplore.ieee.org/document/8418369 and http://www.shl-dataset.org.