ChatGPT Plus costs $20 per month. Here are the features I value and use:
- Others' custom GPTs
- My own custom GPTs
- Integration into email (Thunderbird) and word processing (MS Word for Mac OS)
In many cases, a custom GPT is nothing more than documents somebody has uploaded to serve as the model's primary information source. The creator can also refine the custom GPT by giving it queries and then pointing out mistakes or room for improvement. In some cases, the creator may also work with OpenAI to provide additional capabilities, as Wolfram has done. Regardless, the OpenAI subscription gives access to custom GPTs you create yourself and, if shared publicly, those created by others.
Wolfram's Custom GPT
The Wolfram GPT, for instance, combines the "verbal skills" of Large Language Models with the analytical capabilities of computational AI. As needed, Wolfram's GPT automatically calls an algebra engine or other computational engine and then processes the response. Learn more here.
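The details of Wolfram's integration are not public, but the general pattern (the LLM decides it needs a computation, hands it to an engine, then writes up the engine's answer in prose) can be sketched with OpenAI's tool-calling API. In the sketch below, sympy is only a stand-in for Wolfram's computational engine, "gpt-4o" is an assumed model name, and the flow assumes the model actually chooses to call the tool:

```python
# Rough sketch of the "LLM delegates math to a computational engine" pattern.
import json

import sympy
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def solve_equation(expression: str) -> str:
    """Solve `expression == 0` symbolically and return the roots as text."""
    return str(sympy.solve(sympy.sympify(expression)))


tools = [{
    "type": "function",
    "function": {
        "name": "solve_equation",
        "description": "Solve an algebraic expression set equal to zero.",
        "parameters": {
            "type": "object",
            "properties": {"expression": {"type": "string"}},
            "required": ["expression"],
        },
    },
}]

messages = [{"role": "user", "content": "What are the roots of x**2 - 5*x + 6?"}]
first = client.chat.completions.create(model="gpt-4o", messages=messages, tools=tools)

call = first.choices[0].message.tool_calls[0]   # the model asks for the engine
args = json.loads(call.function.arguments)

messages.append(first.choices[0].message)       # keep the assistant's tool-call turn
messages.append({                               # return the engine's answer
    "role": "tool",
    "tool_call_id": call.id,
    "content": solve_equation(args["expression"]),
})

second = client.chat.completions.create(model="gpt-4o", messages=messages, tools=tools)
print(second.choices[0].message.content)        # the model explains the result in words
```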
Since 2016, my coding has been primarily in the Wolfram Language. While ChatGPT Plus can be a helpful aid in this coding, my own capabilities surpass it. The Wolfram GPT, however, operates at an entirely different level. This is demonstrated by a time-aggregation function it wrote for me.
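The Wolfram Language code it produced is not reproduced here; purely to show what a time-aggregation function does, here is a rough Python/pandas analogue under assumed requirements (daily observations collapsed to monthly averages):

```python
import pandas as pd


def aggregate_daily_to_monthly(series: pd.Series) -> pd.Series:
    """Collapse a daily time series to monthly means."""
    return series.resample("M").mean()


# Example: 90 days of daily data become three monthly averages.
daily = pd.Series(range(90), index=pd.date_range("2024-01-01", periods=90, freq="D"))
print(aggregate_daily_to_monthly(daily))
```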
Here's another example from Wolfram:
My own custom GPTs
I have several of my own custom GPTs, which are created merely by uploading my own documents. Those docs become the LLM's primary knowledge source. One of them is the "Chicago Price Theory Tutor," which is amazing (but not yet shared with the public :)). How smart is this?
Custom GPTs like this can understand and repeat back algebra (see the previous Wolfram screenshot). I did not have to train mine in this regard; I just uploaded a document that had algebra in it. They are terrible at charts and graphs, though, so I instruct my custom GPTs not to even attempt to make a chart.
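The GPT builder handles all of this through ChatGPT's web interface, with no code involved. Purely to illustrate the idea (a document as the primary knowledge source plus a standing instruction), here is a simplified sketch using the OpenAI Python SDK; the file name, model name, prompt wording, and sample question are all hypothetical:

```python
from pathlib import Path

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Hypothetical source document; in ChatGPT it is simply uploaded to the GPT builder.
reference_text = Path("price_theory_notes.txt").read_text()

system_prompt = (
    "You are the Chicago Price Theory Tutor. Answer questions using the "
    "reference text below as your primary source. Do not attempt to draw "
    "charts or graphs; describe them in words instead.\n\n"
    "REFERENCE TEXT:\n" + reference_text
)

reply = client.chat.completions.create(
    model="gpt-4o",  # assumed model name
    messages=[
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": "Why does a cost increase shift the supply curve?"},
    ],
)
print(reply.choices[0].message.content)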
The Chicago Price Theory Tutor once gave a wrong answer, but then I realized that the document I provided, while correct, was confusing on that topic. The 2nd edition of Chicago Price Theory will explain it better.
Integration into email and word processing
The subscription allows you to interact with the LLM via API. You could write your own code to call the LLM, but all I have done so far is use others' code that I prime with my OpenAI API key. The Aify add-on for Thunderbird adds a button to email-compose windows that performs various LLM tasks on the text therein, such as summarizing it, checking grammar, or recommending edits. "GPT for Excel Word" is an add-on that performs similar tasks in MS Word, and presumably in Excel as well, though I haven't yet tried Excel.
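Under the hood, add-ons like these make calls of roughly the following shape: the text from the compose window (or Word document) goes out with your API key, and the model's summary or suggested edits come back. A minimal sketch with the OpenAI Python SDK (model name and prompt wording are assumptions):

```python
from openai import OpenAI

client = OpenAI()  # uses the OPENAI_API_KEY environment variable, much as the add-ons use your key


def summarize(text: str) -> str:
    """Ask the model for a short summary of the given email or document text."""
    reply = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model name
        messages=[
            {"role": "system", "content": "Summarize the user's text in three sentences."},
            {"role": "user", "content": text},
        ],
    )
    return reply.choices[0].message.content


print(summarize("Dear colleague, ... (draft email text goes here) ..."))
```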
There is an extra charge (beyond the $20/month) for API usage. The extra charge is proportional to the amount of text sent back and forth to OpenAI. A busy day for me in this regard generates charges of less than $1. "GPT for Excel Word" levies its own charge of the same order of magnitude.
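To make the proportionality concrete: the charge is essentially tokens times a per-token rate, counted separately for text sent and text received. The rates below are placeholders, not OpenAI's actual prices (see their pricing page for those):

```python
def api_cost_usd(input_tokens: int, output_tokens: int,
                 rate_in_per_1k: float, rate_out_per_1k: float) -> float:
    """Cost grows linearly with the amount of text sent and received."""
    return (input_tokens / 1000) * rate_in_per_1k + (output_tokens / 1000) * rate_out_per_1k
```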
Accessing the API this way, or in other ways, also provides quite a large "context window" that improves results for certain tasks. I think the custom GPTs also, in effect, get a large context window through the sizable documents uploaded to them.