And then you've got to process that in the car with a very small amount of compute power. So, it's all doable and it's happening, but it is a different problem than what, say, a Gemini or OpenAI is doing. And part of the way you can make up for the fact that the inference computer is quite small is by spending a lot of effort on training.
And just like a human: the more you train on something, the less mental work it takes when you do it. Like the first time you drive, it absorbs your whole mind. But then as you train more and more on driving, the driving becomes a background task. It only absorbs a small amount of your mental capacity, because you have a lot of training. So, we can make up for the fact that the inference computer is tiny compared to a 10-kilowatt bank of GPUs, because you've only got a few hundred watts of inference compute.
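The trade the speaker is describing, spending heavy training compute so a small inference computer can do the job, is loosely what knowledge distillation does: a big "teacher" model is used to train a much smaller "student" that is cheap to run. Below is a minimal numpy sketch under toy assumptions; the linear models, layer sizes, learning rate, and step count are all invented for illustration and are not from the talk or any real driving system.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy setup: a 64->4 linear "teacher" stands in for the big
# model you could run on a 10 kW GPU bank. It is built low-rank on purpose,
# so a small student is capable of matching it.
U = rng.normal(size=(64, 4)) / 8.0
V = rng.normal(size=(4, 4))
teacher_W = U @ V

# "Student": two small linear layers, 64 -> 8 -> 4, a fraction of the
# teacher's parameters -- the stand-in for the few-hundred-watt in-car chip.
W1 = rng.normal(size=(64, 8)) * 0.1
W2 = rng.normal(size=(8, 4)) * 0.1

def student(x):
    return x @ W1 @ W2

# Measure the student's error against the teacher before any training.
x_test = rng.normal(size=(1000, 64))
target_test = x_test @ teacher_W
err_before = np.mean((student(x_test) - target_test) ** 2)

# Distillation loop: burn training compute so the cheap student mimics the
# expensive teacher's outputs (plain minibatch gradient descent on MSE).
lr = 0.05
for _ in range(4000):
    x = rng.normal(size=(32, 64))
    h = x @ W1
    e = h @ W2 - x @ teacher_W          # residual vs. teacher outputs
    gW2 = h.T @ e / len(x)              # gradient w.r.t. second layer
    gW1 = x.T @ (e @ W2.T) / len(x)     # gradient w.r.t. first layer
    W1 -= lr * gW1
    W2 -= lr * gW2

err_after = np.mean((student(x_test) - target_test) ** 2)
base = np.mean(target_test ** 2)
print(err_before, err_after, base)
```

After training, the small student's error falls far below its untrained baseline: the expensive work happened offline, and what ships is only the cheap forward pass `x @ W1 @ W2`. Real systems combine this with quantization, pruning, and architecture search, but the compute asymmetry is the same idea.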