It seems to me that the fastest freely usable AI model type with an open license that is compatible with GPT4All is MPT. The other two model types either have legal problems or their ML training ecosystem currently seems to be worse.
A company that offers a commercial training service resulting in MPT models is
MosaicML.
This is their company's organisation card on the Hugging Face AI platform.
This video shows how the training works with the MosaicML cloud and their service.
Check out the
blog post about their 30B base model. The post includes prices for fine-tuning based on their 30B model.
I did some tests for fun on their
MosaicML MPT-30B-Chat.
Their prices are the most affordable ones that I found, and the resulting MPT models are technically compatible with GPT4All.
I have already mirrored the PureBasic documentation and the Tips & Tricks forum and have them ready to be used for fine-tuning our own PureBasic MPT model.
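To give an idea of what that preparation could look like, here is a minimal sketch that turns mirrored documentation pages into prompt/response pairs in JSONL form, a common input format for instruction fine-tuning. The file paths, the rough HTML-to-text extraction and the "prompt"/"response" keys are my own assumptions; MosaicML's service may expect a different layout.

```python
# Minimal sketch: convert mirrored PureBasic documentation pages into
# JSONL prompt/response pairs for fine-tuning.
# Paths, the HTML parsing and the "prompt"/"response" keys are assumptions;
# the training service may expect a different format.
import json
from pathlib import Path
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects the plain text of an HTML page (very rough)."""
    def __init__(self):
        super().__init__()
        self.chunks = []
    def handle_data(self, data):
        text = data.strip()
        if text:
            self.chunks.append(text)

def page_to_pair(path: Path) -> dict:
    extractor = TextExtractor()
    extractor.feed(path.read_text(encoding="utf-8", errors="ignore"))
    text = " ".join(extractor.chunks)
    # Use the file name as a stand-in for the command name.
    command = path.stem
    return {
        "prompt": f"Show a documented example of using the {command} command in PureBasic.",
        "response": text,
    }

with open("purebasic_finetune.jsonl", "w", encoding="utf-8") as out:
    for page in Path("mirror/doc").glob("**/*.html"):
        out.write(json.dumps(page_to_pair(page)) + "\n")
```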
The pros:
- your own AI-based chat bot for the help. How cool is that? A (currently) unique technical feature that clearly signals that a product is modern.
- you could use the resulting MPT model for all kinds of services based on GPT4All: a web front end, a Discord bot.
- an AI example button for each command. The button would send a request like "show a documented example of using the XYZ command in PureBasic" to a fine-tuned MPT-30B based AI (see the sketch after this list). That already works without a specially trained MPT model, but with mediocre result quality.
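To illustrate that last point, here is a minimal sketch of how such a button could query a local MPT model through the GPT4All Python bindings. The model file name is only a placeholder for whatever fine-tuned MPT model would end up being used, and the sampling parameters are guesses.

```python
# Minimal sketch: ask a local MPT model (loaded via the GPT4All Python
# bindings) for a documented example of a PureBasic command.
# The model file name below is a placeholder, not a real published model.
from gpt4all import GPT4All

def ask_for_example(command: str) -> str:
    model = GPT4All("purebasic-mpt-30b-chat.q4_0.bin")  # placeholder file name
    prompt = f"Show a documented example of using the {command} command in PureBasic."
    with model.chat_session():
        # Low temperature to keep the answer close to the documentation.
        return model.generate(prompt, max_tokens=512, temp=0.2)

if __name__ == "__main__":
    print(ask_for_example("OpenWindow"))
```

The same call would sit behind a web front end or a Discord bot; only the way the prompt arrives and the answer is displayed would differ.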
The cons:
- it still costs a few thousand bucks
- the resulting AI does hallucinate. For example, in my tests with their raw 30B model, the AI happily included .NET libraries in its answers.
Is it worth it?
I honestly don't know. I can't judge the positive marketing effect for PureBasic as a product; the final training costs might be low, but could still end up being wasted money. Prices will come down over time, as future GPU cards deliver more power for less energy, but at the same time, other competitors will use AI too.