Across the University of Miami’s three campuses, faculty and students are immersed in research aimed at providing shape and solutions to some of the world’s most pressing issues. Helping to solve some of the riddles is the use of data science—big batches of information filtered through high-performance computers that aid researchers in their efforts to decipher complex formulas and create predictive applications. Here’s a look at some of the ongoing work:
Rosenstiel School of Marine and Atmospheric Science
• Professor Ben Kirtman and assistant research scientist Leo San Pedro Siquiera (pictured below) are continually refining their high-resolution climate simulation, which relies on very large data sets. The model lets them visualize small-scale ocean features, known as fronts and eddies, across the entire globe, providing unprecedented detail in their forecasts.
To isolate climate change from naturally fluctuating weather cycles, Siquiera and Kirtman run an ensemble of simulations of the same time period. This generates massive amounts of data, but it allows them to make climate change predictions and sea level forecasts that are more precise than earlier models. From the models, Siquiera said, they have found that sea levels are rising at a rate of 3 millimeters per year, and that by 2100 the world can expect a two- to six-foot increase in sea levels, depending on carbon emissions and local land elevations. The model also gives scientists regional detail about the extent of the water level rise.
“Sea level changes are uneven across different regions and can deviate substantially,” Siquiera said. “This is the frontier of climate simulation because we have access to unparalleled high-performance computational resources. There are very few other efforts worldwide that are doing these kinds of simulations.”
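The headline numbers above can be sanity-checked with back-of-envelope arithmetic. The sketch below (illustrative only; the actual work is a large ensemble climate simulation) extrapolates the observed ~3 mm/yr trend linearly and compares it with the projected two- to six-foot range, showing how much of that range must come from scenario-dependent acceleration and regional effects rather than the linear trend alone:

```python
# Back-of-envelope check of the quoted figures (illustrative sketch only).
MM_PER_FOOT = 304.8

def linear_rise_mm(rate_mm_per_yr, start_year, end_year):
    """Sea level rise from a constant linear trend, in millimeters."""
    return rate_mm_per_yr * (end_year - start_year)

# The observed ~3 mm/yr trend, extrapolated from 2019 to 2100:
trend_only = linear_rise_mm(3.0, 2019, 2100)  # 243 mm, about 0.8 ft
print(f"linear trend alone: {trend_only:.0f} mm "
      f"({trend_only / MM_PER_FOOT:.1f} ft)")

# The projected 2-6 ft range is far larger, so most of it must come from
# emissions-dependent acceleration and regional land/ocean effects:
for feet in (2, 6):
    print(f"{feet} ft = {feet * MM_PER_FOOT:.0f} mm")
```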
• Professor Claire Paris-Limouzy of the Department of Ocean Sciences focuses on dispersion in the ocean. She developed a computer application called the Connectivity Modeling System to track planktonic organisms and pollutants in the ocean. To truly understand where they move, Paris-Limouzy must also use ocean circulation models. These models provide the flowing currents that disperse and transport plankton, fish larvae, and chemicals throughout the sea. Paris-Limouzy’s models help isolate where eggs and larvae mature and join fish populations. The models can also forecast changes under different environmental conditions, so that the National Oceanic and Atmospheric Administration (NOAA) and other agencies may use the data to determine fish quotas, as well as areas of protection for marine life.
Paris-Limouzy said her modeling system relies heavily on the supercomputer because it pulls information from such large, varied data sets that would take hours to process without a high-powered system. “Modeling of plankton or pollutants at sea can only be run effectively on the supercomputer,” she said. “Otherwise, it would be impossible to combine complex interactions between live particles and the ocean’s currents analytically. You need a system capable of tracking millions of particles with individual behaviors from many locations in three spatial dimensions and through time.”
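The core idea behind this kind of particle tracking can be sketched in a few lines. The example below is a toy Lagrangian advection loop with an entirely hypothetical current field (the real Connectivity Modeling System couples to full ocean circulation models, adds a third spatial dimension, individual larval behavior, and diffusion, which is why it needs a supercomputer):

```python
import random

def velocity(x, y, t):
    """Hypothetical stand-in for an ocean circulation model's current field.
    Returns (u, v) in degrees of longitude/latitude per day."""
    return (0.1 + 0.05 * y, -0.05 * x)

def advect(particles, t0, days, dt=0.1):
    """Step every particle forward through the current field (forward Euler)."""
    t = t0
    for _ in range(int(days / dt)):
        for p in particles:
            u, v = velocity(p["x"], p["y"], t)
            p["x"] += u * dt
            p["y"] += v * dt
        t += dt
    return particles

# Release 1,000 virtual larvae near a spawning site and track them for 30 days.
random.seed(0)
particles = [{"x": random.uniform(-0.5, 0.5),
              "y": random.uniform(-0.5, 0.5)} for _ in range(1000)]
advect(particles, t0=0.0, days=30)
```

Scaling this from a thousand toy particles to millions of particles with individual behaviors, embedded in a high-resolution 3D flow field, is what pushes the computation onto HPC hardware.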
• Professor Villy Kourafalou, in the Department of Ocean Sciences, creates circulation models to better understand how the ocean’s currents flow. These high-resolution models of the Gulf of Mexico and the Florida Straits can offer predictions a week in advance and in extremely fine detail, which has given them an advantage over other models. The models also predict temperature changes in the water, the ocean’s salinity, and sea level changes. This information can be useful for navigation, search and rescue, pollution tracking, fisheries, and hurricane prediction, Kourafalou said. Without access to the supercomputer, however, these models would not exist, she said.
• Professor David Nolan, in the Department of Atmospheric Sciences, uses data from actual storms to create three-dimensional computer simulations of hurricanes and tornadoes, so he and his team can investigate what makes them stronger or weaker, as well as why they are propelled in different directions. Nolan says one benefit of hurricane simulations is that scientists can see the wind damage that hurricanes of a certain intensity can inflict, as well as what kinds of fluctuations in wind speed can occur during a storm. Another benefit is that researchers can see what happens inside a hurricane, and can get a cross-sectional view of the storm at any level of detail they prefer. They are currently working to understand the horizontal structure of the wind field in hurricanes, and Nolan said that in the future, this technology could help pinpoint what wind speeds people could expect at their own homes.
“The data sets from these hurricane simulations are very large—one is 4 terabytes—and are widely used by other people. But they provide an extremely realistic process of a hurricane,” Nolan said.
Photos: TJ Lievonen/University of Miami
College of Engineering and College of Arts and Sciences
• Electrical and computer engineering professors Kamal Premaratne and Manohar Murthi, along with their interdisciplinary U-Link team, which includes associate professor of computer science Stefan Wuchty, are investigating extremist groups to determine why some people are enticed to align themselves with these groups to commit violence and to spread hate speech. They are using data from the social networks Facebook and Twitter, as well as their own network science algorithms and machine learning techniques, to track how this content spreads, Murthi said.
“We are trying to understand the process by which extremist groups propagate. Why do they resonate with some and not others? And how do they spread on social networks?” Murthi said. His collaborator, Premaratne, added: “Our challenge for data science is to scale the algorithm so that it does not fall apart because of these large-scale data sets.”
• Professor of computer engineering Mei-Ling Shyu is working on several projects that utilize data science to solve real-world problems. The first, which she is working on with graduate student Saad Sadiq, aims to detect fake news by capturing the complex hidden relationships in natural language. Their machine learning method uses data from fact-check websites and Google’s 100 billion-word dataset to identify satire, sarcasm, and purposefully misleading content. The program has even garnered awards in an international fake news competition, Sadiq said. Machine learning programs train the computer to “learn” how to classify objects based on certain traits; the more examples they are given, the better the program works.
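The train-on-labeled-examples idea behind such classifiers can be shown with a minimal sketch. The example below is a toy bag-of-words Naive Bayes classifier over a few hypothetical headlines; it is not the team's actual method, which captures far subtler relationships in natural language and trains on vastly larger corpora:

```python
import math
from collections import Counter, defaultdict

class NaiveBayesText:
    """Minimal bag-of-words Naive Bayes classifier (toy sketch)."""

    def fit(self, texts, labels):
        self.word_counts = defaultdict(Counter)
        self.label_counts = Counter(labels)
        for text, label in zip(texts, labels):
            self.word_counts[label].update(text.lower().split())
        self.vocab = {w for c in self.word_counts.values() for w in c}
        return self

    def predict(self, text):
        best_label, best_score = None, float("-inf")
        total = sum(self.label_counts.values())
        for label in self.label_counts:
            # log prior + log likelihoods with add-one smoothing
            score = math.log(self.label_counts[label] / total)
            denom = sum(self.word_counts[label].values()) + len(self.vocab)
            for w in text.lower().split():
                score += math.log((self.word_counts[label][w] + 1) / denom)
            if score > best_score:
                best_label, best_score = label, score
        return best_label

# Toy labeled examples (hypothetical headlines):
texts = ["scientists confirm sea level rise",
         "aliens endorse miracle diet cure",
         "university researchers publish climate study",
         "shocking miracle cure doctors hate"]
labels = ["real", "fake", "real", "fake"]
clf = NaiveBayesText().fit(texts, labels)
print(clf.predict("miracle cure shocks doctors"))  # prints "fake"
```

The more labeled examples such a model sees, the better its word statistics, which is the "more examples, better program" point made above.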
Shyu is also working on a program to help first responders react to natural disasters more quickly. The program uses deep learning, a more complex form of machine learning, to crawl dozens of websites (especially social media sites) and simultaneously cross-reference posts with GPS data for information about a natural disaster, so that responders can locate the worst areas of destruction as quickly as possible. “In any disaster, we need a way to aggregate the data…and the question is how we utilize all this information on the internet, to provide some kind of situation awareness for this?” she said. “[With this technology] you would have an automated process to show this information to the necessary people at the right time. We could respond more quickly because we could get this information more immediately.”
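The aggregation step described here, folding many geotagged posts into a map of the hardest-hit areas, can be sketched simply. The example below buckets hypothetical posts into coarse GPS grid cells and ranks cells by relevant activity; it uses naive keyword matching where the described system uses deep learning, and all post data is invented:

```python
from collections import Counter

def grid_cell(lat, lon, cell_deg=0.1):
    """Bucket a GPS coordinate into a ~0.1-degree grid cell."""
    return (round(lat / cell_deg) * cell_deg,
            round(lon / cell_deg) * cell_deg)

def worst_areas(posts, keywords, top_n=3):
    """Rank grid cells by the number of geotagged posts that mention
    the disaster (toy keyword match standing in for deep learning)."""
    hits = Counter()
    for post in posts:
        text = post["text"].lower()
        if any(k in text for k in keywords):
            hits[grid_cell(post["lat"], post["lon"])] += 1
    return hits.most_common(top_n)

# Hypothetical geotagged posts after a storm:
posts = [
    {"text": "roof gone, street flooded", "lat": 25.76, "lon": -80.19},
    {"text": "flooded up to the porch",   "lat": 25.76, "lon": -80.19},
    {"text": "nice sunset tonight",       "lat": 25.90, "lon": -80.12},
    {"text": "power lines down, flooded", "lat": 25.77, "lon": -80.19},
]
print(worst_areas(posts, keywords=["flood", "power lines", "roof"]))
```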
• Shyu and psychology professor Daniel Messinger are collaborating to develop a more objective way to identify children with autism. Currently, psychologists and psychiatrists use a list of symptoms and characteristics to diagnose children on the autism spectrum. Messinger and Shyu, however, are working to develop a computer program that would use classroom videos to predict the likelihood of autism based on certain characteristics found in children.
“We are trying to use machine learning methods to assess them electronically, to automate the process,” Shyu said.
• With ophthalmologist Dr. Richard Lee at the Bascom Palmer Eye Institute, Shyu is working on a machine learning program that would process eye images to look for irregularities more efficiently than an individual radiologist scanning each one. “This would highlight or signal if there is a problem to save a lot of human labor time,” she said.
• Assistant professor Zheng Wang in the Department of Computer Science is teaching students to conduct research using three-dimensional structures of the DNA in a single cell. This cutting-edge genomics strategy allows scientists to view the structures of all chromosomes—essentially tangled strands of DNA packed into the nucleus of a cell—as well as how chromosomes interact with one another. These models can help scientists understand how neurons may be repaired and regenerated after injuries, Wang said.
(Photo at left: 3D models of the two X chromosomes from a single cell of a female human. The one at left is inactivated, while the one at right has a looser structure that allows normal gene expression.)
In the past, Wang said, scientists would simply use an average to determine the locations of DNA, but with advances in biochemistry techniques and computer algorithms, they have been able to get more accurate depictions, down to a single cell.
“3-D structuring is a new dimension of understanding how the genome functions and may aid the understanding of disease and finding cures for diseases,” Wang said.
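A common way to represent such 3D structures is as a chain of coordinates, one point per genomic bin. The toy sketch below (an invented structure; real models are inferred from chromosome-conformation sequencing data) shows the key payoff of the 3D view: two loci far apart along the DNA sequence can sit right next to each other in space when the chromosome folds back on itself:

```python
import math

# Hypothetical chromosome model: one (x, y, z) point per genomic bin.
# This toy chain loops back on itself every 50 bins.
structure = [(math.cos(2 * math.pi * i / 50),
              math.sin(2 * math.pi * i / 50),
              0.0) for i in range(100)]

def spatial_distance(structure, bin_a, bin_b):
    """Euclidean distance between two genomic bins in the 3D model."""
    return math.dist(structure[bin_a], structure[bin_b])

# Bins 0 and 5 are close in sequence but ~0.6 units apart in space;
# bins 0 and 50 are far apart in sequence yet spatially coincident:
print(f"bins 0-5:  {spatial_distance(structure, 0, 5):.2f}")
print(f"bins 0-50: {spatial_distance(structure, 0, 50):.2f}")
```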
School of Architecture
• UM School of Architecture’s Responsive Architecture + Design Lab, also known as the RAD lab, is tapping into the power of data to build smarter homes, buildings, and cities.
In the lab, Chris Chung and a team of students work on prototypes that use data and the internet to solve challenges that people face every day. Many of the projects the RAD lab is working on take data collected by sensors and apply them to their respective prototypes and projects, so that everyday items become “smart” or “interactive.”
For example, Chung’s lab helped design solar panels that will angle themselves closer to the sun’s rays based on a person’s physical activity for a proposal to enhance nearby Metrorail stations. He has also collaborated with students on a project called Robotic Cloud, where they designed shade drones that would keep Miamians in the Design District out of the heat using sensors that detect the person’s movement. The lab also created ParkShare, an app that would organize and manage the number of parking spaces available in a given garage, to let people know if and where they can park before they spend time circling a lot. And one of the lab’s largest projects, ZenCiti, would allow residents to track their daily usage patterns of electricity, water, and other utilities and then gain better parking spots based on their conservation.
Miller School of Medicine
• Professor Vance Lemmon at the Miller School of Medicine, who shares his lab with John Bixby, vice provost for research, works to find a cure for paralysis by examining nerve cells and using different chemicals to stimulate growth. To do this, Lemmon and his team of researchers have recently added three tools that generate large amounts of data but also help them identify genes and create 3D models of the brain and spinal cord. These models help them visualize treatments that may reduce paralysis.
“Without 3-D imaging, we never would have seen the branching and complex paths of regenerating axons (or nerve fibers),” Lemmon said. (pictured at left: Members of the LemBix lab at The Miami Project to Cure Paralysis discuss gene expression studies from several thousand different cells in a mouse brain that emerged from next generation sequencing experiments. Photo: Rob Camarena/Miller School of Medicine)
The process starts with a next-generation DNA sequencing machine, which helps Lemmon’s team find the best genes to enhance nerve regeneration. Then, a phenotypic screening microscope allows the scientists to test hundreds of genes, or tens of thousands of chemicals (i.e., potential future drugs), in individual cell cultures by automatically taking and analyzing hundreds of thousands of images. Finally, the team tests the genes or chemicals that stimulated axon growth using animal models in the ultramicroscope. This tool takes hundreds of images of a mouse brain or spinal cord injected with ideal genes or chemicals. Those images are then pieced together into 3D models, which helps the team see whether the genes they have uncovered will actually regenerate axons of nerve cells in living animals. If the axons do regenerate, there is hope that they will reestablish the neural connections that were lost in an injury, permitting movement in areas that were previously paralyzed.
“A long time ago, we thought that axons, the long nerve fibers that connect the brain to the spinal cord, grew straight, and after injury nothing was growing. But by using the 3-D reconstructions, we discovered that axons were actually growing a lot, just in a knot,” Lemmon said. “So if we were using these old school methods [without much data] we were misunderstanding how good the regeneration was… It was a shock to find. Now we need to figure out how to exploit that.”
SOURCE: NEWS@TheU by Janette Neuwahl Tannen https://features.miami.edu/2019/plotting-the-future-through-data/big-data/snapshots-of-data-science-in-action/index.html