Series of Sensor Projects Feeds Smart City Development
Joel Zysman, IDSC Advanced Computing Director, spoke to a DDN User Group about how Advanced Computing is set up to “follow the data.” The Advanced Computing core is organized around three areas: Big Data/Analytics, High-Performance Computing Operations, and Cloud Services. At the University of Miami, storage is not local to the servers; instead, storage sits at the center of the data design. This design led to a series of sensor network projects that fed into Smart City development.
Sensor Network Projects
• Long Term Ecological Research Network (LTER)
A partnership with Florida International University, the “Florida Coastal Everglades LTER” sensor-network project analyzed recordings from the Everglades to reveal data on animal population concentrations and movement (sound-based biomass detection). Sound recordings were transmitted via RF communications from self-sustaining weather stations. The subsequent analysis led to numerous grants and a progressive series of sensor projects. Next came . . .
• High-Definition Natural Audio Bank
This project also used recordings, this time to find the song of a particular rare bird thought to be living in the on-campus Gifford Arboretum, and it led to the creation of an “Interesting Sound Detector.” The Advanced Computing team established a background baseline noise level, applied FFT analysis (via FFTW) to the waveform, examined the different peaks, isolated their frequencies, and then found the bird’s song (Northern Mockingbird), revealing that a bird that had not been seen in over 30 years was not extinct. The method was also applicable to studying migration patterns. The AC team worked with Music Engineering on this project, and a Sensors Working Group was formed.
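The article does not include the detector's code, but the described pipeline (baseline, FFT, peak isolation) can be sketched in a few lines. Below is a minimal illustration assuming numpy and scipy; the function name, decibel margin, and Hann window are assumptions of this sketch, and the production system used FFTW rather than numpy's FFT.

```python
# Minimal sketch of the "Interesting Sound Detector" pipeline: window the
# audio, take an FFT, and keep only the spectral peaks that stand out from
# the measured background baseline. The helper name, margin, and window
# choice are illustrative assumptions; the actual system used FFTW.
import numpy as np
from scipy.signal import find_peaks

def interesting_frequencies(samples, sample_rate, baseline_db, margin_db=10.0):
    """Return frequencies (Hz) whose power exceeds the baseline by margin_db."""
    windowed = samples * np.hanning(len(samples))   # reduce spectral leakage
    power_db = 20.0 * np.log10(np.abs(np.fft.rfft(windowed)) + 1e-12)
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    peaks, _ = find_peaks(power_db, height=baseline_db + margin_db)
    return freqs[peaks]
```

The isolated peak frequencies can then be compared against known songs, here the Northern Mockingbird's. This project worked so well, it led to . . .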
• Traffic Circle Acoustic Survey
A new traffic circle in the City of Coral Gables was receiving noise complaints. Working with the City, the AC team began an eight-week study with multiple geopositioned receivers that streamed data (as opposed to using recordings). To stream the sound data, the team invented a “Listening Box,” which comprised a Sensor Suite (sound, light color/intensity, temperature, relative humidity, camera), a Dedicated Sound Processor, and Wi-Fi Burst Data Loading (automated communications). Until this project, data had simply been written to DDN storage, but this project required the ability to analyze the data while streaming. The AC team used MapReduce/data analytics to reduce the frequency data. Their solution applied MQTT (a machine-to-machine Internet of Things connectivity protocol) and Apache Flume (which collects, aggregates, and moves large amounts of log data) to build a Big Data solution that ingested the Internet of Things data, analyzed it while streaming, and even provided user interfaces. It was revealed that the noise source was the unauthorized use of jackhammers at a construction site (unrelated to the traffic circle).
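The article names MQTT and Flume but shows no code. The sketch below illustrates only the MQTT half: a Listening Box publishing sensor-suite readings to a broker via the paho-mqtt client. The broker host, topic layout, and sensor stub are assumptions for illustration; on the server side, a Flume pipeline would collect and aggregate these messages for streaming analysis.

```python
# Illustrative sketch: a Listening Box publishing its sensor-suite readings
# over MQTT. The broker host, topic name, and read_sensor_suite() stub are
# assumptions, not the team's actual configuration.
import json
import random
import time

import paho.mqtt.client as mqtt

BROKER = "ingest.example.edu"              # hypothetical broker host
TOPIC = "listening-box/box-01/readings"    # hypothetical topic layout

def read_sensor_suite():
    """Stand-in for the real sensor suite (sound, light, temperature, RH)."""
    return {
        "ts": time.time(),
        "sound_db": 60.0 + 20.0 * random.random(),
        "temp_c": 28.0,
        "humidity_pct": 75.0,
    }

client = mqtt.Client()
client.connect(BROKER, 1883)
client.loop_start()                        # handle network I/O in the background

while True:
    client.publish(TOPIC, json.dumps(read_sensor_suite()), qos=1)
    time.sleep(1)                          # the real boxes batched Wi-Fi bursts
```

The study pinpointed the source of the noise so well, it led to another noise-analysis project . . .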
• No-PII Sound Survey
This low-power, low-cost project set out to analyze student noise at six locations on and off campus; however, HIPAA Best Practices privacy regulations prohibit recording personally identifiable information (PII). To analyze the noise, the team worked with “Sound Power Levels” instead. Analysis of the frequency-window decomposition (300-600 Hz), peaks in relation to sound density, voices that layer together chaotically versus harmonically, orders of magnitude, and “real communications” led to usable results with no “identifiable” data. Using an ad-hoc Raspberry Pi data logger/recorder, the Broadcast Node collected, aggregated, and transformed the data via a Data Hub. The Backhaul used Wi-Fi, Ethernet, BLE, Bluetooth 2, and FM. Processing involved Signal Filtering, Data Encapsulation, and Noise Detection; finally, Storage included Buffering Data, Baseline Information, and State Information. And it worked! It was determined that 45% of the noise was validated on campus and 65% off campus. Subsequent changes to policies, programs, and outreach successfully addressed noise levels.
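The privacy trick, measuring sound power in a frequency window instead of keeping audio, is easy to sketch. The illustration below assumes numpy; the function name and the uncalibrated decibel reference are assumptions, while the 300-600 Hz window matches the decomposition described above.

```python
# Sketch of the no-PII idea: compute the sound power level in the
# 300-600 Hz voice band and discard the raw samples immediately, so no
# recognizable speech is ever stored. The function name is illustrative.
import numpy as np

def band_power_db(samples, sample_rate, low_hz=300.0, high_hz=600.0):
    """Uncalibrated sound power level (dB) within [low_hz, high_hz]."""
    spectrum = np.fft.rfft(samples * np.hanning(len(samples)))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    band = (freqs >= low_hz) & (freqs <= high_hz)
    power = np.mean(np.abs(spectrum[band]) ** 2)
    return 10.0 * np.log10(power + 1e-12)

# Only this scalar leaves the node; the sample buffer is overwritten on the
# next window, so nothing identifiable is retained.
```

This led to another interesting project . . .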
• Zenciti, Yucatán, Mexico
The Advanced Computing team (along with the Software Engineering team) worked on a new Smart City project on the Yucatán peninsula (near Mérida, Mexico) called “Zenciti,” planning the network, the Internet of Things, and the IT design. Upon completion, the team set up an architectural studio where all the data was available worldwide, streamed the data through AutoCAD 3D, built a visualization wall, and participated in three international shows. While working on this project, the team was approached to create a bus-system app: the Yucatán, Mexico Bus Router . . .
• Yucatán, Mexico Bus Router
The Yucatán, as it turned out, had a problem with bus service. The client wanted to track all the private and public buses using the buses’ GPS and geolocation data from an app on riders’ cellphones, without recording anything, and with the analytics done in real time. Advanced Computing began by planning a decision-support system with a bit of machine learning and artificial intelligence; however, the Big Data solution used for the Traffic Circle Acoustic Survey was not going to work for this project. Instead, the team integrated a MapReduce Metablocks workflow with an HDFS connector and ran it on the regular DDN high-performance Pegasus system, giving faster turnaround than the dedicated Big Data cluster (this had nothing to do with the processors; it was all I/O bottleneck). This solution gave real-time bus arrival prediction with great accuracy: 90% of predictions within 5 minutes of actual arrival, and 50% within 1 minute. The solution helped with bus routing data, vehicle supplementation, and efficiency, and resulted in a 30% increase in ridership.
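The article does not detail the prediction model itself, so the following is only a toy illustration of the real-time idea: estimate a bus's speed from its two most recent GPS pings and project the time remaining to a stop. All names and the constant-speed assumption are hypothetical.

```python
# Toy illustration of real-time arrival estimation from streaming GPS pings;
# the team's actual MapReduce/ML pipeline is not described in the article.
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two GPS fixes, in kilometers."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2.0 * 6371.0 * math.asin(math.sqrt(a))

def eta_minutes(prev_fix, curr_fix, stop):
    """Minutes to arrival, given (lat, lon, unix_s) fixes and a (lat, lon) stop."""
    moved_km = haversine_km(prev_fix[0], prev_fix[1], curr_fix[0], curr_fix[1])
    hours = (curr_fix[2] - prev_fix[2]) / 3600.0
    speed_kmh = moved_km / hours if hours > 0 else 0.0
    remaining_km = haversine_km(curr_fix[0], curr_fix[1], stop[0], stop[1])
    return float("inf") if speed_kmh <= 0 else 60.0 * remaining_km / speed_kmh
```

A production system would smooth speeds over many pings and follow the route geometry rather than the straight-line distance used here.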
• Miami Beach Listening-City Initiative
The Yucatán Bus Router project was presented at the Smart Cities MIAMI Conference, where AC was approached by attending representatives from the City of Miami Beach. The City wanted to know where people were going, whether on foot, in buses, or on bicycles. The data would be used for emergency response, traffic control, and public-transportation analysis in the short term, and for civic planning, public facilities, and paths/lighting in the midterm. Accepting this project meant accepting the data and storing it long term. The challenges included no network, no power, exposed sites, and the need for the system to stand alone. To address these issues, the team built a Deep Learning Sound Classifier that used:
- Spectrogram Pattern Matching
- Theano Convolutional Neural Network
- Caffe RNN (Recurrent Neural Network)
With an Echolite Sensor Array of over a thousand sensors, using AI, they taught the City of Miami Beach to listen. The solution delivered a single data point on a single piece of hardware. They integrated a people counter and helped with notifications about the movement of a 50,000-person crowd during spring break. They were able to tell when a major pump failed during a storm, saving the City over $1 million in damages. The project is ongoing.
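The classifier itself is not published with the article. As a rough stand-in, here is the general shape of a small spectrogram CNN for sounds such as pump failures or crowd noise. Since Theano is no longer maintained, this sketch uses scipy and PyTorch instead; the architecture, layer sizes, and class count are assumptions.

```python
# Rough stand-in for the Deep Learning Sound Classifier: a small CNN over
# log-spectrograms. The team used Theano (CNN) and Caffe (RNN); Theano is
# unmaintained, so this illustrative sketch uses scipy + PyTorch. Layer
# sizes and the class count are assumptions.
import torch
import torch.nn as nn
from scipy.signal import spectrogram

class SoundClassifier(nn.Module):
    """Tiny CNN that labels log-spectrogram patches (e.g. pump, crowd, traffic)."""

    def __init__(self, n_classes=4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d((8, 8)),   # fixed-size features for any input
        )
        self.classifier = nn.Linear(32 * 8 * 8, n_classes)

    def forward(self, x):                   # x: (batch, 1, freq, time)
        return self.classifier(self.features(x).flatten(1))

def to_log_spectrogram(samples, sample_rate):
    """Turn raw audio into a (1, 1, freq, time) log-spectrogram tensor."""
    _, _, sxx = spectrogram(samples, fs=sample_rate)
    return torch.log1p(torch.from_numpy(sxx).float())[None, None]
```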