By AI Trends Staff
At its Build conference for developers held recently, Microsoft unveiled the first features in a commercial product powered by GPT-3, the natural language model developed by OpenAI, the AI research lab backed by Microsoft.
The beta version of GPT-3, released in June 2020, has 175 billion machine learning parameters, making it the largest deep learning language model built to date. The largest language model available before GPT-3 was Microsoft’s Turing NLG, with 17 billion parameters.
Microsoft reached an agreement with OpenAI in September 2020 to license GPT-3 for its own products and services. In July 2019, Microsoft and OpenAI had announced a partnership that included a $1 billion investment from Microsoft for work that included building a supercomputer on Microsoft’s Azure cloud platform, according to an account from InfoQ.
Originally formed as a non-profit, OpenAI moved to a hybrid model in March 2019, with the intent to raise investment capital. Microsoft has referred to its relationship with OpenAI as “exclusive”.
“Low Code” Tool Aimed at “Citizen Developers”
At Build, Microsoft announced that GPT-3 will be integrated into Microsoft Power Apps, which it describes as a “low code” application development platform aimed at users with little or no coding experience. These users are also described as “citizen developers,” meaning they create application capabilities for themselves or others while working outside the IT department, often reporting to a business unit.
For example, the new AI-powered features will allow an employee building an e-commerce app to describe a programming goal using conversational language like “find products where the name starts with ‘kids.’” A fine-tuned GPT-3 model then offers choices for transforming the command into a Microsoft Power Fx formula, the open source programming language of the Power Platform, according to an account on the Microsoft AI blog.
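As a rough illustration, a request like that could map to a Power Fx formula along the following lines (the `Products` table and `Name` column are hypothetical names used here for the sketch, not taken from Microsoft’s announcement):

```
// "find products where the name starts with 'kids'"
// Products and Name are hypothetical table/column names
Filter(Products, StartsWith(Name, "kids"))
```

The point of the feature is that the user describes the goal in plain language and GPT-3 proposes candidate formulas like this one, rather than the user writing the `Filter` expression by hand.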
Microsoft runs GPT-3 on Azure using Azure Machine Learning. Power Fx is based on the formula language of Microsoft Excel.
“Using an advanced AI model like this can help our low-code tools become even more widely available to an even bigger audience by truly becoming what we call no code,” stated Charles Lamanna, corporate vice president for Microsoft’s low code application platform.
He noted that Microsoft’s agreement with OpenAI allows it to license the code behind the GPT-3 model and integrate the technology directly into its products. “This will allow people to query and explore data in ways they literally couldn’t do before,” stated Lamanna.
With the new features tapping GPT-3, a user can type plainspoken language such as: “Show 10 orders that have stroller in the product name and sort by purchase date with newest on the top,” to produce a fairly complex formula.
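A formula of that complexity might look something like the following sketch, assuming hypothetical `Orders` table and `'Product Name'` / `'Purchase Date'` column names (Power Fx uses single quotes around names that contain spaces):

```
// "Show 10 orders that have stroller in the product name
//  and sort by purchase date with newest on the top"
// Orders, 'Product Name', and 'Purchase Date' are hypothetical names
FirstN(
    Sort(
        Filter(Orders, "stroller" in 'Product Name'),
        'Purchase Date',
        SortOrder.Descending
    ),
    10
)
```

Composing nested `Filter`, `Sort`, and `FirstN` calls like this is exactly the kind of formula-writing the feature is meant to spare novice users.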
The user still needs an understanding of the code they are implementing; the features are designed to assist users of the Fx programming language to choose the right formulas to get the results they need, according to Microsoft. The new features announced at Microsoft Build, the company said, will be available in preview in the English language throughout North America by the end of June.
Microsoft’s goal is to widen the pool of users able to use the tool. “This isn’t at all about replacing developers, it’s about finding the next 100 million developers in the world,” stated Lamanna.
Risks of Large Language Models
Large language models get their capacity from studying language patterns found by scraping essentially all the text data available on the internet. Thus, the models pick up sexist, racist, and abusive language along with everything else. “The text they produce can be toxic in unexpected ways,” stated a recent account in The Verge.
Microsoft has created constraints aimed at minimizing the risks of GPT-3, “but the core of the program is still based on language patterns learned from the web, meaning it retains this potential for toxicity and bias,” suggested the account in The Verge.
In an interview with The Verge, Microsoft’s Lamanna said the company is working to address this risk, for example by implementing a list of words and phrases the system will not respond to. Balancing safety against functionality is a challenging tradeoff. “Like any filter, it’s not perfect,” Lamanna stated.
Users will need to confirm any formula written by the AI, he said. “The human does choose to inject the expression. We never inject the expression automatically,” he stated.
GPT-3 License Seen Giving Microsoft an Advantage
The worlds of scientific research and applied AI are colliding around Microsoft’s efforts to commercialize GPT-3, suggests an account from TechTalks written by Ben Dickson, the site’s founder.
“There’s a clear line between academic research and commercial product development. In academic AI research, the goal is to push the boundaries of science. This is exactly what GPT-3 did,” Dickson stated. In commercial product development, “You must solve a specific problem, solve it ten times better than the incumbents, and be able to run it at scale and in a cost-effective manner.”
The OpenAI-Microsoft alliance makes sense for both companies. “OpenAI would have a hard time finding a way to enter an existing market or create a new market for GPT-3,” Dickson stated. “On the other hand, Microsoft already has the pieces required to shortcut OpenAI’s path to profitability.”
Microsoft has reach across different industries, thousands of organizations and millions of users of Office, Teams, Dynamics and Power Apps. “These applications provide perfect platforms to integrate GPT-3,” Dickson stated. And with its exclusive access to the code and architecture of GPT-3, Microsoft has an advantage over competitors. “Whatever use case any company finds for GPT-3, Microsoft will be able to do it faster, cheaper, and with better accuracy thanks to its exclusive access to the language model,” Dickson stated.