Tesla CEO Elon Musk has announced that he’s throwing a “Super Fun AI Party/Hackathon” with Tesla engineers and a select group of invitees.

This comes on the heels of last week’s fourth quarter earnings call, where Business Insider reports Musk admitted Tesla was at least a “few months” from rolling out “feature complete Full Self Driving” (FSD).

There’s a lot to unpack here, so let’s start with the timing of the party announcement.

Elon Musk told Tesla shareholders last February that the company would have feature complete FSD by the end of 2019.

According to The Verge, Musk told reporters during a call last February that there were a few caveats to the release:

Now first of course, this will need to be supervised by the drivers because it will take us billions of miles to get to the safety level where driver observation is no longer required. And then we will need to convince regulators of this, so there are some steps along the way… But as I said before I’m confident we’ll release full self-driving this year.

Musk back-tracked on his comments in October of 2019 when he was forced to admit Tesla would miss its 2019 deadline. Fast forward to last week’s call and here we are: Elon’s going to throw a good old fashioned hackathon to solve autonomous driving. We assume there will be punch and cake.

Full Self Driving is the name of a Tesla software package that Musk claims will give current Tesla vehicles the ability to drive from a residential area, onto a highway, and through city streets without any human intervention.

In his words, according to Electrek:

Yeah, feature-complete, I mean, it’s the car able to drive from one’s house to work, most likely without interventions. So it will still be supervised, but it will be able to drive — it will fill in the gap from low-speed autonomy with Summon. You’ve got high-speed autonomy on the highway, and intermediate speed autonomy, which really just means traffic lights and stop signs.

So feature-complete means it’s most likely able to do that without intervention, without human intervention, but it would still be supervised. And I’ve gone through this timeline before several times, but it is often misconstrued that there’s three major levels to autonomy. There’s the car being able to be autonomous, but requiring supervision and intervention at times. That’s feature complete. Then it doesn’t mean like every scenario, everywhere on earth, including every corner case, it just means most of the time.

The goalposts, however, have moved significantly since Musk made the above statement.

Per the Business Insider report, Musk told reporters on the earnings call that the company’s now aiming for something a little less ambitious:

Feature complete just means it has some chance of going from your home to work, let’s say, with no interventions. It doesn’t mean the features are working well.

It appears as though Musk and Tesla are treating the problem as a computer vision problem. A recent report from Teslarati quotes Musk as explaining:

I think that’s looking like maybe it’s going to be a couple of months from now. And what isn’t obvious about Autopilot and Full Self-Driving is just how much work has been going into improving the fundamental elements of autonomy.

In terms of labeling, labeling with video in all eight cameras simultaneously. This is a really, I mean in terms of labeling efficiency, arguably like a three order of magnitude improvement in labeling efficiency. For those who know about this, it’s extremely fundamental, so that’s really great progress on that.
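A back-of-envelope sketch helps show why labeling video across synchronized cameras could multiply efficiency. The eight-camera count comes from the quote above; the clip length and object counts below are made-up numbers for illustration, not Tesla figures, and this is nothing like Tesla’s actual pipeline:

```python
# Toy sketch: if a human labels an object once and software propagates
# that label across synchronized camera views and video frames, one
# annotation replaces hundreds.

CAMERAS = 8            # from the quote: "all eight cameras simultaneously"
FRAMES_PER_CLIP = 36   # hypothetical clip length

def manual_labels_needed(num_objects):
    """One human annotation per object, per camera, per frame."""
    return num_objects * CAMERAS * FRAMES_PER_CLIP

def propagated_labels_needed(num_objects):
    """One human annotation per object; propagation does the rest."""
    return num_objects

print(manual_labels_needed(5))      # 1440 manual annotations
print(propagated_labels_needed(5))  # 5 seed labels
```

Under these invented numbers the saving is a few hundredfold per clip; stretch propagation across longer clips and auto-labeled frames and a “three order of magnitude” claim is at least arithmetically plausible.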

It’s likely that Musk is referring to the problem of labeling objects in the real world – a computer vision challenge. This indicates that Tesla hopes to brute-force its way into city driving by training neural networks to recognize every possible object it could encounter in a city and act accordingly.

One potential flaw in this strategy: things aren’t always what they seem.
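As a toy illustration of that brittleness, consider a hypothetical nearest-centroid classifier over invented 2-D “features” — nothing like a production vision network, but it shows how a recognizer always returns *some* label, and how ambiguous inputs sit on a knife’s edge:

```python
import math

# Hypothetical class centroids in a made-up 2-D feature space.
CENTROIDS = {
    "stop_sign": (0.9, 0.9),
    "pedestrian": (0.2, 0.8),
    "traffic_cone": (0.3, 0.3),
}

def classify(feature):
    """Return the label of the nearest centroid to the feature vector."""
    return min(CENTROIDS, key=lambda label: math.dist(feature, CENTROIDS[label]))

print(classify((0.85, 0.95)))  # stop_sign

# Two nearly identical ambiguous inputs get different labels:
print(classify((0.55, 0.60)))  # traffic_cone
print(classify((0.50, 0.70)))  # pedestrian
```

A confident-looking answer comes back either way; nothing in the output distinguishes a clear-cut case from one where a tiny perturbation flips the label — which is exactly the worry with relying on recognition alone.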


Outside of Tesla, the accepted paradigm for self-driving research involves the inclusion of city infrastructure to support autonomous vehicle technology.

The reason we can’t currently summon a driverless Uber to drive us around town isn’t that regulators are scared to let the robots loose on our city streets; it’s that the problem can’t be solved with just computer vision and LIDAR.
