Leveraging Smartphone Cameras for Collaborative Road Advisories

Submitted by R. S. Shenilton & M. Banu Chandhar
Ubiquitous smartphones are increasingly becoming the dominant platform for collaborative sensing. Smartphones, with their ever richer set of sensors, are being used to enable collaborative driver-assistance services such as traffic advisory and road condition monitoring. To enable such services, the smartphones' GPS, accelerometer and gyroscope sensors have been widely used. In contrast, smartphone cameras, despite being very powerful sensors, have largely been neglected. In this paper, we introduce a collaborative sensing platform that exploits the cameras of windshield-mounted smartphones. To demonstrate the potential of this platform, we propose several services that it can support, and prototype SignalGuru, a novel service that leverages windshield-mounted smartphones and their cameras to collaboratively detect and predict the schedule of traffic signals, enabling Green Light Optimal Speed Advisory (GLOSA) and other novel applications. Results from two deployments of SignalGuru, using iPhones in cars in Cambridge (MA, USA) and Singapore, show that traffic signal schedules can be predicted accurately: on average, SignalGuru's predictions come within 0.66s for pre-timed traffic signals and within 2.45s for traffic-adaptive traffic signals. Feeding SignalGuru's predicted signal schedule to our GLOSA application, our vehicle fuel consumption measurements show savings of 20.3% on average.

Index Terms—smartphone, camera, Intelligent Transportation Systems, services, traffic signal, detection, filtering, prediction, collaboration

INTRODUCTION
With an ever richer set of sensors, increased computational power and higher popularity, smartphones have become a major collaborative sensing platform. In particular, smartphones have been widely used to sense their environment and provide services to assist drivers. Several systems have been proposed that leverage smartphone GPS, accelerometer and gyroscope sensors to estimate traffic conditions, detect road abnormalities and compute fuel-efficient routes. Cameras, in contrast to other smartphone sensors, have so far been underutilized for automated collaborative sensing. Cameras have been used only in a handful of participatory sensing systems, where both image capture and image analysis are performed by a human user. Such applications include the monitoring of vegetation, garbage, and campus assets. In all these services, users must point their smartphone camera at the target object, capture an image and upload it to the central service, where a human operator will analyze it. The adoption of collaborative sensing services that leverage smartphone cameras without manual user and operator effort has so far been hindered by two false beliefs: 1) the view of smartphone cameras is always obstructed (e.g., phones are carried in pockets or placed flat on a table), and 2) image processing requirements are prohibitively high for resource-constrained mobile devices.

In this paper, we propose a novel collaborative sensing platform that is based on the cameras of windshield-mounted smartphones. We show that accurate and near real-time camera-based sensing is possible. Many drivers already place their phones on the windshield in order to use existing popular services like navigation. Once a phone is placed on the windshield, its camera faces the road ahead. Our proposed sensing platform leverages these cameras to opportunistically capture content-rich images of the road and the environment ahead.
Inter-device collaboration is also leveraged to gather more visual road-resident information and distill it into knowledge (services) that can be provided to the drivers. With their cameras, a network of collaborating windshield-mounted smartphones can enable a rich set of novel services. In this paper, we focus on the description and evaluation of the SignalGuru service. SignalGuru leverages the cameras of windshield-mounted...
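To make the GLOSA idea concrete, the sketch below shows one simple way an advisory speed could be derived from a predicted signal schedule. This is an illustrative assumption, not the paper's actual algorithm: the function name, the speed bounds, and the decision rules are all hypothetical, and a real GLOSA system would also account for queues, acceleration profiles and green-phase duration.

```python
def glosa_advisory_speed(distance_m, time_to_green_s,
                         min_speed_mps=2.0, max_speed_mps=13.9):
    """Hypothetical GLOSA sketch: suggest a speed (m/s) that lets the
    vehicle arrive at the intersection on green without stopping.

    distance_m:      distance to the stop line, in meters.
    time_to_green_s: predicted seconds until the signal turns green
                     (0 or less if it is already green).
    min/max_speed_mps: comfort/legal bounds (13.9 m/s is about 50 km/h).
    Returns an advisory speed, or None if stopping is unavoidable.
    """
    if time_to_green_s <= 0:
        # Signal predicted to already be green: cruise at the limit.
        return max_speed_mps
    # Speed that would make the vehicle arrive exactly at green onset.
    required = distance_m / time_to_green_s
    if required > max_speed_mps:
        # Even at the limit we arrive after the green begins, which is
        # fine as long as the green phase is still on: advise the limit.
        return max_speed_mps
    if required < min_speed_mps:
        # Even crawling at the minimum speed we arrive before the green
        # starts, so a stop cannot be avoided.
        return None
    return required
```

For example, a vehicle 100 m from a signal predicted to turn green in 10 s would be advised to hold 10 m/s (36 km/h) and glide through without braking.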