October 12, 2021
Louise Lerner, Rob Mitchum, and Maureen McMahon
From an expedition to hunt for fossils in the deserts of Wyoming to building a virtual reality headset, University of Chicago Physical Sciences Division graduate students were exploring a range of questions during the Summer Quarter. Here is how six students spent the summer.
Fossil hunting in Wyoming, Melissa Wood, Geophysical Sciences
In the high deserts of southwestern Wyoming, graduate student Melissa Wood went on the hunt for 50-million-year-old rodents.
Wood, a paleontologist in the Department of the Geophysical Sciences, is particularly interested in studying how populations of ancient mammals changed over time as the climate fluctuated.
“For example, in ancient times of global warming you often see species get a little smaller, because when you’re smaller you have greater surface-to-volume ratio and can give off heat better,” she said. “You also often see a crash in species diversity, and we want to look at how exactly that works—which species in the food chain disappear, which change, how they recover.”
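The surface-to-volume reasoning in the quote can be made concrete with a toy calculation. The sketch below (assuming a spherical body purely for illustration; not part of Wood's research) shows that halving an animal's characteristic size doubles its surface area per unit volume, which is why smaller bodies shed heat more readily:

```python
import math

def surface_to_volume(radius):
    """Surface-area-to-volume ratio of a sphere of the given radius."""
    surface = 4 * math.pi * radius**2
    volume = (4 / 3) * math.pi * radius**3
    return surface / volume  # simplifies to 3 / radius

for r in (1.0, 0.5, 0.25):
    print(f"radius {r}: surface/volume = {surface_to_volume(r):.1f}")
# radius 1.0: surface/volume = 3.0
# radius 0.5: surface/volume = 6.0
# radius 0.25: surface/volume = 12.0
```

Because the ratio scales as 3/radius, a body half the size dissipates heat across twice as much surface per unit of heat-producing volume.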
She particularly focuses on the period called the Eocene, about 56 to 34 million years ago. In North America during this time period, there are already early primates, rhinos, tapirs and many others, including many species of rodents.
“It’s a really interesting time for mammals; it’s been about 10 million years since the dinosaurs were wiped out, so mammals are starting to take off,” she said. “And climatically it’s a really interesting time because the start of the Eocene is also a very rapid global warming event.”
This was Wood’s first year leading her own fossil expedition. She chose the Washakie Basin, just below the continental divide, which has relatively underexplored deposits of mammal fossils from the Eocene. It will be a multi-year process; the first year is largely dedicated to mapping the deposits and working out the ancient environment that each bed represents. “The field has come a long way since these beds were last mapped; you can get much greater accuracy these days,” she said.
In successive years, she will begin collecting and cataloguing in earnest. But she’s already found many fossils—“Teeth are often sitting on the surface; you can pick them up”—and will identify and catalogue them back in Chicago. This generally involves examining tiny fossil rodent teeth under a microscope, to map them against known existing species.
The Field Museum has a collection from this area, from University of Chicago Prof. William Turnbull’s fieldwork in the 1960s and 70s, so Wood will cross-reference her findings against the older collections. (After she completes her research, all her specimens will be housed at the Field.)
Wood is pleased with the expedition; early findings suggest there are rich mammal deposits over a long time period. “I think there’s a lot to learn there,” she said.
Engineering a haptic sensory device, Shan-Yuan Teng, Computer Science
The frontier of augmented reality has a touchy-feely problem. With the right headset device, users can fill their visual field with graphics, menus, and other interfaces. But in order to interact with those virtual objects, the user must wear often-bulky devices on their hands and fingers that restrict their ability to manipulate the real world.
Shan-Yuan Teng wants tomorrow’s augmented reality users to have it both ways. With his colleagues in the Human Computer Integration Lab directed by Assistant Professor of Computer Science Pedro Lopes, Teng invents wearable devices that allow for smooth transitions between the virtual and real worlds.
One such technology, Touch & Fold, sits on the fingernail and provides haptic feedback when the user presses a virtual button or toggles a virtual switch. When not in use, the device folds away from the fingerpad so that the wearer can feel real objects during AR-assisted tasks such as repairing a bicycle or learning an instrument. The paper describing the device, co-written with Pengyu Li, Romain Nith, Joshua Fonseca, and Lopes, received an honorable mention for the best paper award at the 2021 edition of CHI, the premier human-computer interaction conference.
This summer, Teng performed remote demos of Touch & Fold for HCI conferences around the world, playing a virtual piano while viewers saw the mixed reality environment from his point of view.
He also spent the summer building his next invention with master’s student Yujie Tao and Lopes: another finger-mounted device that frames the user’s fingerpad and creates the illusion of softness when touching physical objects. While wearing the technology, a user could perceive a hard 3D-printed teddy bear as soft to the touch, Teng said. It’s all part of making mixed reality feel more tangible, in startling new ways.
Inventing a VR headset to restore smell, Jas Brooks, Computer Science
Of the five basic senses, smell has received the least high-tech attention. But scents have immense power to influence emotions and experience — a potential that perfume makers figured out long before scientists. In the Human Computer Integration Lab, Jas Brooks has explored the impressive ability of smells and the human olfactory system to enrich everything from virtual reality to video games and film.
If it doesn’t sound like your typical human-computer interaction research, that’s only because few devices exist to influence people’s sense of smell. So Brooks has invented new technologies, such as a VR headset attachment that sprays chemicals into a user’s nose to create temperature illusions of hot and cold environments, and a nose ring that electrically stimulates the trigeminal nerve to communicate potentially hazardous but imperceptible fumes.
This summer, Brooks and their labmates worked on clinical partnerships to help test this “smelling aid” device in patients who have lost their sense of smell. Brooks has also been developing new 3D-printed prototypes that enable easier scientific investigation of scent and its applications. In addition, Brooks’ stereo smell project was named an honorable mention in the Experimental category of Fast Company’s 2021 Innovation by Design Awards.
The tinkering was inspired by Brooks’ fascination with scratch-and-sniff printing and its use in combination with other media. In February 2021, they curated a “Twitch and Sniff Along” series with the Weston Game Lab, recreating the smell cards distributed with the games Leather Goddesses of Phobos and Leisure Suit Larry: Love for Sail! and conducting live playthroughs on Twitch. This summer, Brooks moderated a panel discussion on “Participatory Scented Cinema,” and they’re working with Doc Films on curating a series of films accompanied by scent, from 1960s Smell-O-Vision through modern art experiments adding smells to pre-existing movies. The events are both fun and useful for research into the subjective nature of scent, Brooks said.
“People individually have a hard time expressing themselves about smell, but there's something interesting that happens when a group talks about a smell, because they each have a viewpoint,” Brooks said. “Having all of these different viewpoints being discussed creates a better understanding of what the smell is, and a lot of these playful interactions allow that communication with the audience.”
Designing an app to predict baseball pitch-types, Evan Boyd, Nic Carlson, and Kelley Monzella, Master of Science in Data Analytics Program (MScA)
Sign stealing in baseball has long been an accepted part of the game. But when the Houston Astros defeated the Los Angeles Dodgers in the 2017 World Series, all eyes were on a team that got ahead by illegally using a video camera to steal signs. The controversial use of technology to replace learned guesswork resulted in firings and suspensions of managers within the Astros franchise and across Major League Baseball.
Evan Boyd, Nic Carlson, and Kelley Monzella, students in the University of Chicago Master of Science in Data Analytics Program (MScA), were intrigued by the challenge to develop an app to predict pitch types in Major League Baseball. If possible, the data it would gather could inform in-game decisions, help with pre-game studies, and support real-time betting insights—all without breaking baseball’s rules.
In December, the team began discussing the most effective data science approach to the problem. A catalog of pitch-by-pitch information from the 2019, 2020, and 2021 seasons informed the creation of a model. “These features include the pitcher and what pitch types they typically throw, the batter, plus in-game information such as the inning, batter’s count in the plate appearance, and number of outs in the inning,” they explained. They tested many models, some reaffirming existing research and others novel to the field.
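A minimal sketch of what such a feature-based pitch-type model might look like, using Python and scikit-learn on synthetic data. The feature names, the random-forest model choice, and the data here are illustrative assumptions, not the students’ actual pipeline:

```python
import numpy as np
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder

# Synthetic pitch-by-pitch data standing in for the real catalog:
# who is pitching and batting, plus in-game state (inning, count, outs).
rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({
    "pitcher_id": rng.integers(0, 20, n),
    "batter_id": rng.integers(0, 50, n),
    "inning": rng.integers(1, 10, n),
    "balls": rng.integers(0, 4, n),     # count: balls
    "strikes": rng.integers(0, 3, n),   # count: strikes
    "outs": rng.integers(0, 3, n),
})
# Fake label with a learnable pattern: pitchers throw fastballs
# when behind in the count, otherwise mix pitch types randomly.
df["pitch_type"] = np.where(
    df["balls"] > df["strikes"], "fastball",
    rng.choice(["fastball", "slider", "changeup"], n),
)

X_train, X_test, y_train, y_test = train_test_split(
    df.drop(columns="pitch_type"), df["pitch_type"], random_state=0
)
model = Pipeline([
    # One-hot encode the identity features; pass game state through as-is.
    ("encode", ColumnTransformer(
        [("ids", OneHotEncoder(handle_unknown="ignore"),
          ["pitcher_id", "batter_id"])],
        remainder="passthrough",
    )),
    ("clf", RandomForestClassifier(n_estimators=100, random_state=0)),
])
model.fit(X_train, y_train)
print(f"held-out accuracy: {model.score(X_test, y_test):.2f}")
```

A pipeline like this is easy to retrain as new seasons of data arrive, and the same fitted model can be queried pitch-by-pitch during a live game, which is what an application layer such as the one the students built in R and Python would do.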
Once they created accurate models to predict pitch types, they used the computer programming languages R and Python to build an application that automatically predicts pitches during a game in progress and allows clients to specify which features to track.
The students said the process of writing a successful app dramatically improved their Python skills, while classes like Machine Learning and Data Mining helped them navigate building sound models to generate predictions.
“The processes of this entire project—from scraping and cleaning data to modeling to applying it—are the building blocks towards an effective data science study,” the authors wrote.
When they presented their research to three prominent data scientists in August at the MScA showcase of Capstone Projects, they “approached the presentation as if we were on the show ‘Shark Tank,’ presenting a new approach to a known topic in baseball to prospective clients.” Their pitch came in second.