(Auto) pilot
I've just been scanning an article in the April issue of Wired magazine about an OpenAI/GitHub project called Copilot. To the cognoscenti, this probably qualifies as old news, but it had passed me by as I've taken a less than avid interest in software development in recent months. This just might have changed my mind...
I don't generally place a great deal of trust in the pursuit of Artificial Intelligence per se, as I believe that human intelligence, guile and adaptability will always best a machine, even one of the greatest human minds' devising. But along the way we can make some useful tools - tool-making being one of our species' greatest abilities - that can in turn generate more useful tools.
It seems that what the team behind Copilot achieved inadvertently, whilst trying to emulate human language and responses, was to allow the AI access to data sets they hadn't thought of including. In scraping the internet for data, they accidentally allowed their software to analyse not just natural language, but programming languages in the form of extant code scattered about the net.
The damned software didn't just learn to emulate natural language: it taught itself not only to autocomplete code for programmers, but to write large chunks of programs by itself, from the scantest of prompts, and in multiple languages. I find this aspect of the 'accident' fascinating and slightly spooky in equal measure - sufficiently so that I've actually signed up to the waiting list for early public access to this stuff. At sixty-seven, I'm still interested enough in tech to occasionally get my hands dirty in the abstruse world of software.
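To picture what that comment-to-code trick looks like in practice, here's a hand-written sketch of the kind of exchange Copilot is reported to handle - my own illustration, not actual Copilot output. The programmer types only the comment; the tool proposes the function beneath it.

```python
# The "prompt" is just this comment:
# check whether a string reads the same forwards and backwards,
# ignoring case, spaces and punctuation

def is_palindrome(text: str) -> bool:
    # keep only letters and digits, lower-cased
    cleaned = [ch.lower() for ch in text if ch.isalnum()]
    return cleaned == cleaned[::-1]

print(is_palindrome("A man, a plan, a canal: Panama"))  # True
print(is_palindrome("Copilot"))  # False
```

The unnerving part, of course, is that the machine produces something like this from the comment alone, in whichever language the surrounding file happens to be written in.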