Microsoft partnered with OpenAI to develop GitHub Copilot, which uses Codex, a model descended from GPT-3, to suggest code to users as they type. Many GitHub users have enjoyed Copilot as a pair programmer, and most of them accept its suggestions while coding. However, a SendGrid engineer reported a bug showing that the tool can leak sensitive, potentially functional API keys, which could grant access to services such as databases.
Sam Nguyen, a software engineer at SendGrid, received a list of secret API keys when he asked the AI tool for one. API keys are secret string tokens used to authenticate access to services and databases; they are not encrypted, so anyone who obtains one can use it. The developer opened an issue reporting the concern, with a screenshot showing at least four suggested keys. GitHub CEO Nat Friedman acknowledged the issue, stating that “these secrets are almost entirely fictional, synthesized from the training data.”
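Leaked keys of this kind are usually caught with pattern-based secret scanning. As a minimal sketch, the snippet below flags strings matching SendGrid's documented key format (an "SG." prefix followed by two base64url-style segments); the function name, regex details, and sample snippet are illustrative assumptions, not part of any real scanner.

```python
import re

# Assumed pattern for SendGrid-style API keys: "SG." prefix, then a
# 22-character segment, a dot, and a 43-character segment of
# base64url-safe characters. Real scanners bundle many such patterns.
SENDGRID_KEY_RE = re.compile(r"SG\.[A-Za-z0-9_-]{22}\.[A-Za-z0-9_-]{43}")

def find_suspect_keys(text: str) -> list[str]:
    """Return substrings of `text` that look like SendGrid API keys."""
    return SENDGRID_KEY_RE.findall(text)

# A fabricated example key (not a real secret) embedded in a code suggestion.
suggestion = 'sendgrid_api_key = "SG.' + "a" * 22 + "." + "b" * 43 + '"'
matches = find_suspect_keys(suggestion)
print(len(matches))  # one suspect key found
```

Scanners like this run on commits and, in principle, could run on model output before it reaches the user, which is one mitigation for the class of bug Nguyen reported.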
Although GitHub’s team is working on the issue, it has prompted many open source developers to consider migrating away from GitHub. Developers have criticized GitHub Copilot, claiming the tool uses copyrighted source code in an unauthorized and unlicensed way.
One developer said: “This product injects source code derived from copyrighted sources into their customers’ software without crediting the licensed source code. This significantly violates the terms of the copyright holders’ work.” Microsoft has currently released a public preview of GitHub Copilot, which is trained on code from public repositories on GitHub.
It later plans to release a commercial version of the product, aimed at helping enterprises adapt the tool to their own programming styles. Nor will this AI technology be limited to Microsoft: OpenAI CTO Greg Brockman said the company will be releasing the Codex model this summer so that third-party developers can tailor it to their own applications.