There are things in life that can be predicted quite well. The tides rise and fall. The moon waxes and wanes. A billiard ball bounces around a table according to orderly geometry.
And then there are things that defy easy prediction: a hurricane that changes direction without warning, the splashing of water from a fountain, the graceful disorder of branches growing from a tree.
These phenomena and others like them can be described as chaotic systems, and they are notable for exhibiting behavior that is predictable at first but grows increasingly random with time.
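To make "predictable at first, increasingly random later" concrete, here is a minimal sketch (not from the paper) using the logistic map, a standard toy chaotic system: two runs that start from almost identical values agree for the first several steps and then diverge completely.

```python
# Toy illustration of chaotic divergence using the logistic map.
# (An illustrative example only; the research itself concerns optical cavities.)

def logistic_trajectory(x0, r=3.9, steps=50):
    """Iterate x -> r * x * (1 - x) and return the full trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.400000)   # one starting value
b = logistic_trajectory(0.400001)   # an almost identical starting value

for step in (0, 5, 10, 20, 30, 40):
    print(f"step {step:2d}: |difference| = {abs(a[step] - b[step]):.6f}")
# Early steps: the two runs agree to many decimal places (predictable).
# Later steps: the difference grows to order one (effectively unpredictable).
```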
Because of the huge role that chaotic systems play in the world around us, scientists and mathematicians have long sought to understand them better. Now, Caltech's Lihong Wang, Bren Professor of Medical Engineering in the Andrew and Peggy Cherng Department of Medical Engineering, has developed a new tool that could help with this research.
In a paper published in the journal Science Advances, Wang describes how he used an ultrafast camera of his own design, which recorded video at one billion frames per second, to observe the movement of laser light inside a chamber specially designed to induce chaotic reflections.

A video recorded with a camera shooting at one billion frames per second shows how two pulses of laser light take different paths while reflecting inside a chaotic optical cavity. Credit: Caltech
“Some cavities are non-chaotic, so the path the light takes is predictable,” Wang says. But in the current work, he and his colleagues used that ultrafast camera to study a chaotic cavity, “in which the light takes a different path every time we repeat the experiment.”
The camera uses a technology called compressed ultrafast photography (CUP), which Wang has shown in other research to be capable of speeds as fast as 70 trillion frames per second. The speed at which a CUP camera captures video makes it capable of seeing light, the fastest thing in the universe, as it travels.
But CUP cameras have another feature that makes them uniquely suited to studying chaotic systems. Unlike a traditional camera that shoots one frame of video at a time, a CUP camera essentially shoots all of its frames at once. This allows the camera to capture the entirety of a laser beam's chaotic path through the chamber all in one go.
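As a rough, hedged illustration of the "all frames at once" idea, the one-dimensional sketch below mimics a CUP-style acquisition: each frame of a fast scene is multiplied by a fixed pseudorandom mask, shifted in space in proportion to its arrival time, and summed into a single snapshot, from which the whole video is then estimated by inverting that forward model. This is not the authors' code; real CUP works in two spatial dimensions and uses sparsity-regularized reconstruction, so the naive least-squares inversion here is only a stand-in.

```python
# Toy 1D sketch of a CUP-style single-shot acquisition (illustrative only):
# encode each frame with a fixed pseudorandom mask, shear it by its arrival
# time, integrate everything into one snapshot, then invert the forward model.
import numpy as np
from scipy.sparse import lil_matrix
from scipy.sparse.linalg import lsqr

rng = np.random.default_rng(0)
nx, nt = 32, 8                      # spatial pixels, time frames
mask = rng.integers(0, 2, nx)       # fixed pseudorandom binary encoding mask

# Ground-truth dynamic scene: a bright spot sweeping across the field of view.
scene = np.zeros((nt, nx))
for t in range(nt):
    scene[t, 4 + 3 * t] = 1.0

# Forward model A: (encode with mask) -> (shift frame t by t pixels) -> (sum).
ny = nx + nt - 1                    # length of the single integrated snapshot
A = lil_matrix((ny, nt * nx))
for t in range(nt):
    for x in range(nx):
        A[x + t, t * nx + x] = mask[x]
A = A.tocsr()

snapshot = A @ scene.ravel()        # the one compressed measurement

# Naive estimate of the entire video from that single snapshot.
# (Real CUP reconstructions add sparsity priors; this is only approximate.)
recovered = lsqr(A, snapshot)[0].reshape(nt, nx)
print("snapshot length:", snapshot.shape[0])
print("true peak per frame:     ", scene.argmax(axis=1))
print("recovered peak per frame:", recovered.argmax(axis=1))
```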
That matters because, in a chaotic system, the behavior is different every time. If the camera captured only part of the action, the behavior that went unrecorded could never be studied, because it would never occur in exactly the same way again. It would be like trying to photograph a bird, but with a camera that can capture only one body part at a time; moreover, every time the bird landed near you, it would be a different species. You could try to assemble all of your photos into a single composite image, but that cobbled-together bird would have the beak of a crow, the neck of a stork, the wings of a duck, the tail of a hawk, and the legs of a chicken. Not exactly useful.
Wang says the ability of his CUP camera to capture the chaotic motion of light could breathe new life into the study of optical chaos, which has applications in physics, communications, and cryptography.
“It was a really hot field some time ago, but it died down, probably because we didn’t have the tools we needed,” he says. “Experimentalists lost interest because they could not do the experiments, and theorists lost interest because they could not validate their theories experimentally. This was a fun demonstration to show people in that field that they finally have an experimental tool.”
The paper describing the research, titled “Real-time observation and control of optical chaos,” appears in the January 13 issue of Science Advances. Co-authors are Linran Fan, formerly of Caltech and now an assistant professor at the Wyant College of Optical Sciences at the University of Arizona; and Xiaodong Yan and Han Wang of the University of Southern California.
Reference: “Real-time observation and control of optical chaos” by Linran Fan, Xiaodong Yan, Han Wang and Lihong V. Wang, 13 January 2021, Science Advances.
DOI: 10.1126/sciadv.abc8448
Funding for the research was provided by the Army Research Office Young Investigator Program, the Air Force Office of Scientific Research, the National Science Foundation, and the National Institutes of Health.