The Nature of Sensors

As sensors and machine learning fuse, they form a critical layer in the synthesis of machine and human intelligence.

Sensors help us perceive and interpret the world. These technologies now allow us to build virtual models of reality, thanks to a revolution in machine learning. The resulting ever-increasing fidelity of these models gives humanity a titan-like power to shape the planet.

This fusion of sensors and machine learning is fundamentally changing intelligence on Earth. It augments biological sense perception. As such, it serves as an essential catalyst in the marriage of human and machine intelligence.

What Does a Sensor Do?

Let’s start by grounding ourselves in a basic understanding of what sensors do. Sensors transduce signals, which is to say that they convert signals from one form into another. That’s why sensors are sometimes referred to as “transducers.” For instance, a digital kitchen thermometer converts temperature into an electrical signal, which it then converts into the symbols we read on its display.
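
To make the idea concrete, here is a minimal Python sketch of those two conversions for a hypothetical digital kitchen thermometer. The sensor characteristics and calibration constants are invented for the example, not taken from any real device.

```python
# Two transductions in a hypothetical digital kitchen thermometer:
# physical signal -> electrical reading -> symbols on a display.

def adc_to_celsius(adc_reading: int, adc_max: int = 1023,
                   v_ref: float = 3.3) -> float:
    """Convert a raw analog-to-digital reading into degrees Celsius.

    Assumes a made-up linear sensor that outputs 10 mV per degree C.
    """
    voltage = adc_reading / adc_max * v_ref   # counts -> volts
    return voltage / 0.010                    # volts -> degrees C

def to_display(celsius: float) -> str:
    """Second transduction: a number becomes symbols we can read."""
    fahrenheit = celsius * 9 / 5 + 32
    return f"{fahrenheit:.0f}°F"

print(to_display(adc_to_celsius(230)))        # roughly "166°F"
```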

Sensors help us to interpret our physical surroundings. In this sense, they are analogous to our senses. Our sense of smell has evolved over billions of years to be able to detect molecular structure and convert it into electrochemical signals that we associate with certain odors, such as an overripe banana. Our skin contains nerve endings that convert pressure and temperature signals into sensations of touch, just as our eyes transform light into visual imagery. With sensors, we use tools to carry out some of this work of interpreting the world around us.

Sense and Sensor-bility

To understand the role of sensors in the emerging synthesis of machine and human intelligence, it helps to distinguish between two approaches to sensing technology. In one case, the goal is to augment our biological senses; in the other, the goal is to replace them.

A telescope augments our eyes rather than replacing them. Telescopes amplify light signals rather than transducing them into another type of signal. As a result, we are able to make sense of that amplified signal with the same visual processing biology we use in everyday sight. In contrast, a thermometer substitutes for the thermoreceptors in our skin. Traditional thermometers transduce a thermal signal into a visual one, allowing us to read the temperature without burning ourselves.

With traditional telescopes, the signal is boosted but the pattern recognition is still carried out with the human biology of our visual processing system. That’s sensory augmentation. With the thermometer, the signal is no longer captured by the biology of the nerves in our skin. Smart digital thermometers don’t just capture the signal but actually handle the pattern recognition processes through logic—or models—embedded in the device. That’s how such a device interprets a particular level of molecular vibration as, say, 175℉, taking the guesswork out of knowing whether your chicken is properly cooked. That’s sensory substitution.

In sensory augmentation, the signal is collected and boosted so that our biological processes can more easily interpret it in our own brain. In sensory substitution, the signal is both collected and transformed—or transduced—the hallmark of a true sensor.

The Shift from Senses to Sensors

Think of an object, maybe a rubber duck. The light bouncing off of it is data that gets picked up by our eyes and converted into electrochemical signals through the process of sensation. Those signals move through the visual cortex in our brain in the process of pattern recognition that we know as perception.

With sensors, these processes change. Machines take over for biological sensations in capturing signals, while synthetic models take over for the signal transformation of biological perception in interpreting those signals.

This signal transformation was still relatively crude in early sensors. Old-fashioned thermometers mapped the movement of mercury against a range of temperatures (calibrated, variously, by Daniel Fahrenheit and Anders Celsius). You can think of that calibration process as a very simple model for mapping mercury’s expansion against estimates of temperature.
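
In code, that “very simple model” might look like the sketch below: a straight line fixed by two calibration measurements, one taken in an ice bath and one in boiling water. The column heights are invented for illustration.

```python
# The "model" inside an old mercury thermometer: a straight line fixed
# by two calibration measurements. Column heights here are invented.

ICE_POINT_MM = 10.0     # column height observed in an ice bath (0 °C)
STEAM_POINT_MM = 110.0  # column height observed in boiling water (100 °C)

def column_height_to_celsius(height_mm: float) -> float:
    """Map the mercury column's height to an estimated temperature."""
    fraction = (height_mm - ICE_POINT_MM) / (STEAM_POINT_MM - ICE_POINT_MM)
    return fraction * 100.0

print(column_height_to_celsius(47.5))   # 37.5, roughly body temperature
```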

Today’s models look very different. What used to require intellectually challenging work by humans is now being automated by machines. Sensors play a critical role in the building of these new models as well as in putting them to work.

Sensors for Training Machine Learning

Machine learning is how we have automated model building, and it would be impossible without the deluge of data from today’s sensors. That automated capture is what drives the tsunami of data behind the Information Age.

In model building, sensors sample data from the real world, which is then used to “sculpt” the model through an iterative feedback process. This process happens by generating an initial algorithmic model and then running the sample data through it to see how closely the model maps to the data. What people refer to as “training” the model amounts to automatically tweaking its parameters so that, with each iteration of training, the model gets closer to the reality of the sample data.
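
The toy example below sketches that feedback loop in Python, fitting a two-parameter linear model to synthetic “sensor” samples with plain gradient descent. The data, model, and learning rate are all made up, but the shape of the loop is the point: predict, measure the error, nudge the parameters, repeat.

```python
# A minimal sketch of the training feedback loop: start with an initial
# model, measure how far its outputs are from the sampled data, and
# nudge the parameters to close the gap. The samples are synthetic.
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0.0, 10.0, size=200)            # raw sensor readings
y = 2.0 * x + 1.0 + rng.normal(0, 0.2, 200)     # quantity we want to predict

w, b = 0.0, 0.0                                 # initial (untrained) model
lr = 0.01                                       # learning rate

for step in range(2000):
    pred = w * x + b                            # run samples through the model
    error = pred - y                            # how far off is the model?
    w -= lr * np.mean(error * x)                # tweak parameters toward
    b -= lr * np.mean(error)                    # the reality of the samples

print(f"learned w={w:.2f}, b={b:.2f}")          # close to the true 2.0 and 1.0
```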

The emerging new reality of sensor technology is that it is inextricably linked to machine learning. Sensors still capture data, but now the process of signal transduction has morphed into model building. The original signal is raw data, but through the training processes of machine learning, that signal is transformed—or transduced—into an algorithmic representation of the reality captured in that data. In a sense, this has always been what sensors have done. The calibrations of a thermometer transform the molecular vibrations of mercury into a model representing one variable of reality—temperature. With today’s machine learning models, that signal transduction has just become capable of much more complex representations of reality.

[Image: Humanity’s first image of a black hole, brought to you by machine-learning-backed sensors.]

Sensors for Applying Machine Intelligence

We’ve seen how sensors provide the data that fuels the training of machine learning models, and now we turn to the role sensors play in putting those models to work. This process of applying machine learning is frequently referred to as inference, and it too relies heavily on sensor data.

An inference is an idea or conclusion that’s drawn from evidence and reasoning.

Inference entails drawing in new data from the environment and exposing it to the model that was generated through earlier training data. This is how machine learning models are deployed into actual applications. Machine learning applications start with a trained model and use that model to make sense of new data. Here, the model acts as a kind of algorithmic reasoning, and the new data serves as evidence. The application applies the model’s reasoning in order to transform the signal of the new evidence into some conclusion.
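
Continuing the toy example from the training sketch above, inference might look like the few lines below: the learned parameters are frozen, a fresh sensor reading serves as the evidence, and the output is the conclusion. The specific numbers are hypothetical.

```python
# Inference with a frozen, already-trained model: the parameters act as
# the "reasoning", the new sensor reading is the "evidence".

W_TRAINED, B_TRAINED = 2.0, 1.0       # parameters learned during training
SAFE_TEMP_F = 165.0                   # commonly cited safe minimum for poultry

def infer_is_chicken_done(raw_reading: float) -> str:
    """Apply the trained model to new evidence and draw a conclusion."""
    temp_f = W_TRAINED * raw_reading + B_TRAINED    # model as reasoning
    return "done" if temp_f >= SAFE_TEMP_F else "keep cooking"

print(infer_is_chicken_done(83.0))    # 167 °F -> "done"
```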

To make this more concrete, let’s use an actual example. Google has powerful machine learning models that have been trained on huge image collections. They’ve made it possible for me to use these models for my own inference work. I can point my phone’s camera at a coffee cup and it will correctly infer not just that what is before me is a coffee cup but also that it features an image of the Beatles’ Yellow Submarine. In short, the camera sensor in my phone captures light signals that Google’s machine vision models then transform into a new type of signal—human language.
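
Google’s production models are not something we can inspect directly, but an openly available pretrained classifier gives a feel for the same light-to-language transduction. The sketch below uses torchvision’s ResNet-50 as a stand-in, and the image file name is hypothetical.

```python
# A stand-in for the camera-to-language inference described above, using
# torchvision's pretrained ResNet-50 rather than Google's own models.
# "photo_of_mug.jpg" is a hypothetical file captured by a phone camera.
import torch
from PIL import Image
from torchvision.models import resnet50, ResNet50_Weights

weights = ResNet50_Weights.DEFAULT
model = resnet50(weights=weights).eval()        # trained model, frozen
preprocess = weights.transforms()               # matching preprocessing

image = Image.open("photo_of_mug.jpg").convert("RGB")  # captured light signal
batch = preprocess(image).unsqueeze(0)                 # pixels -> input tensor

with torch.no_grad():
    probs = model(batch).softmax(dim=1)[0]      # run the evidence through the model

top = probs.argmax().item()
print(weights.meta["categories"][top])          # a human-language label, e.g. "coffee mug"
```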

What we’re seeing today is that the transduction role of sensors in transforming signals is becoming increasingly reliant on machine learning. Innovation in today’s camera market, for example, is less about advances in optics, or even in camera sensors, than about machine learning models. In fact, our phones are now brimming with so many sensors that they are starting to function a lot like Spock’s tricorder.

Sensors, Models, and the Future of Intelligence

In short, sensors supply the data for training machine learning models—and for applying them. They allow us to capture data from the world and pull it into a virtual representation of that physical reality. In fact, one way to think about the models we now build with sensor data is that they are contributing to a vast, new, multifaceted virtual reality. Sensors, in effect, are helping humans to transform physical reality into virtual reality.

Sensors take on the data-capture functions of our sense organs, just as machine learning takes over the signal transducing functions of our brains that we know as perception. Simple models, like those embedded in early thermometers, still required our interpretation to answer questions like “How hot does this chicken need to get before I can eat it?” The more powerful machine learning models embedded in today’s sensors are taking over more and more of our brains’ work of interpreting data. They can now tell us exactly when that chicken is ready to eat and the origins of that brightly colored submarine.

Models help us to turn our everyday, qualitative experience into a quantitative understanding that helps us interpret, predict, and manipulate the world around us. Sensors and new machine learning models are what allow humans to extend the interpretive capacity of our brains. They help us to augment our imaginative capacity beyond the limitations of our biology. As such, they form a critical layer in an emerging synthesis of machine and human information processing and the future of intelligence on this planet.

[Image: Is that what my perception really looks like?]
