Children learn that light things are much easier to set in motion than heavy ones when they roll the bottom ball of a snowman. Yet even when such a ball rolls down a slope, a smartphone camera is fast enough to take a sharp picture of it.1 But the smaller the sphere we look at, the more important it becomes to know precisely where it is. For many, it makes a big difference whether a raindrop hits the ground or, unfortunately, a person without an umbrella.2
If we now look at much smaller spheres, a standard camera is no longer sufficient to make them visible at all. And even if it could, it would not be fast enough to produce a sharp image. In this tiny world, electrons, holes and other (quasi-)particles move on the length scale of nanometres (a human hair is about 70,000 nanometres thick) and need only femtoseconds (a femtosecond is a quadrillionth of a second) to change their location. And such a change of location can already decide whether a component that relies on the position of these particles works or not.
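A quick back-of-the-envelope calculation (not from the original text, just an illustration of the scales mentioned above) shows why no ordinary camera can follow this motion: a particle covering nanometres in femtoseconds moves at roughly a million metres per second.

```python
# Illustrative unit conversion: "nanometres per femtosecond" in everyday units.
nanometre = 1e-9     # metres
femtosecond = 1e-15  # seconds

# A particle moving 1 nm per 1 fs:
speed = nanometre / femtosecond  # ≈ 1e6 m/s, i.e. about 3.6 million km/h
print(f"speed ≈ {speed:.3g} m/s")

# For comparison with the hair mentioned above (~70,000 nm across):
# at this speed, crossing a hair's width takes about 70,000 fs,
# i.e. a mere 70 picoseconds.
hair_width = 70_000 * nanometre
crossing_time = hair_width / speed
print(f"crossing time ≈ {crossing_time / femtosecond:.3g} fs")
```

The numbers make the problem concrete: a camera shutter lasting even a microsecond would see the particle smeared across the distance of many hair widths.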
We are developing methods, for example, to track charge carriers through nanostructures on these length and time scales. We want to understand why they take certain paths and, ideally, steer them in one direction or another in a targeted way. In this way we gain a better understanding of dynamics on the nanoscale and lay the groundwork for optimising everyday technologies such as solar cells.
1 At least this is what we have been told, because here in the Northwest snow is rare and mountain slopes even rarer.
2 A person not prepared for rain must obviously be new to the area.