There was a crash when a tractor trailer turned in front of a Tesla running on Autopilot. Autopilot is a driver-assistance system, not a self-driving system; drivers were supposed to serve as the backup.
The idea was to get a partial self-driving system (lane keeping, adaptive cruise control, and automatic braking) out and let hundreds of drivers test it in real-world conditions. This driver found a flaw in the system, but his inattention let it become a fatal one.
Following that crash Tesla modified their sensor system.
Tesla's plan is to put hundreds of thousands of EVs on the road for at least a year with their self-driving systems observing what drivers encounter, so that Tesla can identify as many unusual problems as possible and modify its systems before letting the cars drive themselves.
Teslas can drive themselves now. Here's a video of a Tesla driving itself. https://youtu.be/eAal0juXXzU
You'll notice that the person sitting in the passenger seat keeps tapping the steering wheel with his fingers. That's a requirement Tesla has built into its self-driving system for now: if the driver doesn't tap often enough, the system turns off. It increases the odds that the observer-rider is paying attention.
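The "tap or disengage" behavior described above is essentially a watchdog timer. Here is a minimal sketch of that idea in Python; the class name, the timeout value, and the method names are all hypothetical illustrations, since Tesla's actual implementation and parameters are not public.

```python
import time

# Assumed timeout between required wheel inputs; the real interval is unknown.
TIMEOUT_SECONDS = 15.0

class AttentionWatchdog:
    """Hypothetical sketch: disengage assistance if the driver stops
    providing steering-wheel input for too long."""

    def __init__(self, timeout=TIMEOUT_SECONDS, now=time.monotonic):
        self._now = now                # clock function, injectable for testing
        self._timeout = timeout
        self._last_tap = now()         # treat engagement time as the first "tap"
        self.engaged = True

    def register_tap(self):
        """Called whenever wheel input is detected."""
        self._last_tap = self._now()

    def check(self):
        """Called periodically; turns the system off after a silent timeout."""
        if self._now() - self._last_tap > self._timeout:
            self.engaged = False       # hand control back to the driver
        return self.engaged
```

The clock is passed in as a parameter so the timeout logic can be exercised with a fake clock instead of real waiting, which is a common pattern for testing time-dependent code.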
Over a year Tesla will gather data on billions of miles of driving, map most roads in the US, and find as many "turning trailer" problems as they can.
We should never expect self-driving cars to be 100% accident free. There's always going to be a deer that leaps from behind a large rock just as the car arrives. Or a piano that falls from a snapped cable immediately in front of the car. Or a sinkhole that opens under the car.
What we can expect is that self-driving cars will be significantly safer than human drivers. The odds of an accident won't fall to zero, but they should fall to less than a tenth of what they are with human drivers.