When software providers talk about the technologies they say “democratize” AI, they also talk a lot about “guardrails.” That’s because the rapidly evolving world of AI tools is still more like a republic governed by the machine-learning elite.
Although no-code and low-code AI tools promise to give everyone a chance to build business analytics models or simple applications that use AI to complete tedious tasks, the amateurs whom no-code AI companies refer to as “citizen data scientists” are often required to play with the bumper rails up. That’s because toolmakers and management are worried about the risks inherent in allowing just anyone to create sophisticated AI systems.
“As you go into low-code and actually more the no-code environment, then there are guardrails as to what you can and can’t do,” said Ed Abbo, president and chief technology officer at C3 AI, which provides software designed to help people with zero coding experience build machine learning models.
Low-code and no-code development tools are often sold as a way to free up trained professionals to concentrate on bigger priorities. However, businesses hoping to use these systems to build their own AI without the help of data scientists may need a data scientist to step in after all, particularly if deploying AI in a way that touches customers.
Databricks, another company that sells no-code machine-learning software, also invokes the guardrails term when discussing the limits of codeless AI tools. “Many data science teams won't approve AutoML solutions for colleagues who are not formally trained in ML unless they feel there are sufficient guardrails,” said Kasey Uhlenhuth, senior product manager at Databricks, where she helps build its AutoML tools and machine-learning models.
No-code AI tools also may not allow for the level of customization everyday business users have come to expect from no-code software development tools.
In the case of C3 AI’s no-code AI software, users are limited to off-the-shelf machine-learning algorithms, said Abbo. “We’ve made available a certain number of machine-learning algorithms and those are the ones they can use, and if they need others they will need to go into a low-code environment to enable those,” he said.
However, “if you’re in a heavy-code environment, that developer is not restricted,” Abbo said. “They have complete freedom to do what they want.”
That freedom was important to Informatica customers seeking more tailored machine-learning models relevant to their specific business operations, products and services, said CEO Amit Walia. Informatica added coding capabilities to its previously code-free tools so customers could include their own coding tweaks to experiment and customize models.
“The reality is there is some amount of coding that can happen, and we respect that and that’s why we added low-code,” Walia said.
Limiting risk
But the limitations imposed on no-code AI tools are about more than just restricting algorithm and coding options. Sometimes restrictions are intended to mitigate risky use of data or development of models that could produce inaccurate or discriminatory results. People who aren’t educated in machine learning might not recognize whether the data sets they use to train models to detect fraud are deficient, said Uhlenhuth.
“For example,” she said, “fraud detection data sets have severe class imbalance, so it's important to know which metrics and techniques to use to fit the model. If 99% of your dataset has no fraud, then a model that always guesses ‘no fraud’ would be correct 99% of the time.” This sort of problem can emerge without more sophisticated machine-learning practitioners overseeing the process, she said.
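The baseline-accuracy trap Uhlenhuth describes can be shown in a few lines of Python. This is a minimal sketch with made-up numbers (a hypothetical 1,000-record data set at a 1% fraud rate), not anything drawn from Databricks’ tools: a “model” that always predicts “no fraud” scores 99% accuracy while catching zero actual fraud, which is why accuracy alone is the wrong metric here.

```python
# Hypothetical labels: 0 = no fraud, 1 = fraud (1% fraud rate, like the
# imbalanced data sets Uhlenhuth describes).
labels = [0] * 990 + [1] * 10

# Naive "model": always predicts no fraud.
predictions = [0] * len(labels)

# Accuracy looks excellent because the majority class dominates.
accuracy = sum(p == y for p, y in zip(predictions, labels)) / len(labels)

# Recall (fraction of actual fraud cases caught) exposes the problem.
true_positives = sum(p == 1 and y == 1 for p, y in zip(predictions, labels))
recall = true_positives / sum(labels)

print(f"accuracy: {accuracy:.0%}")  # 99% -- looks great on paper
print(f"recall:   {recall:.0%}")    # 0% -- the model catches no fraud at all
```

A practitioner would reach for metrics such as recall or precision, or resampling techniques, rather than raw accuracy, which is exactly the kind of judgment call a no-code user may not know to make.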
Although “making machine-learning easier and easier to use is a great goal,” using no-code tools to help run crucial business operations “is a very risky thing to do,” said Will Uppington, co-founder and CEO of TruEra, which provides software for assessing machine-learning models while in development and in operation.
“For systems that are important to a company’s business [some companies are] actually doing the opposite -- they are digging deeper and understanding the systems more in order to feel they can trust the systems,” he said.
Another data scientist who asked to remain anonymous said that while no-code tools would be useful for educating businesspeople about how AI models are built, they would not be appropriate for business-critical purposes. “I can’t see an insurance company adopting something like [no-code AI tools] without deep scrutiny,” said the source, adding, “I don’t think that anything created this way is going to be hugely impactful.”
Many companies providing no-code tools for novices expect them to be used with guidance and intervention from trained data scientists.
"Citizen data scientists typically end up collaborating with a formally trained data scientist before putting models into production. This is almost always true if the model is going to be customer-facing as opposed to an internal tool,” said Uhlenhuth, who added that amateur users typically employ Databricks’ AutoML tool to “experiment with various features and models to see if there is any predictive power before reaching out to the data science team for help.”
Ultimately, some companies supplying low- and no-code AI tools for non-experts encourage more machine-learning education. C3 AI developed a training course over the last year, for example.
“We basically have removed the coding from the equation, but you still need to train the citizen data scientists on what can you do with AI and machine learning, and so we have training programs that can spool up hundreds of business analysts and it takes them through a course,” Abbo said.