• Possible to turn ChatGPT, Bard, or any LLM from an amnesiac goldfish into a memory mammoth!

  • 2023/07/23
  • Duration: 1 hour
  • Podcast


  • Summary

  • What if your ChatGPT, Bard, or any other LLM could remember the things you said last month? For example, that you were planning to buy a home near your kids' school, the story you told your son last week, or the gift ideas your wife wouldn't like. That would be amazing, right? However, with the current short memory limit, also known as the context window, these capabilities remain a dream.

    But Dr. Burtsev's research has come to our rescue! Thanks to his breakthrough, your LLMs can now accurately remember 1 million tokens, equivalent to several books' worth of information. We are now much closer to having a dreamlike chat agent.
    Want to know more about what this means for the rest of us? Tune in to this episode, where Dr. Burtsev joins us to discuss his inspiration, how he made it possible, and many other insightful thoughts and ideas about interactive learning, brain-inspired machine learning algorithms, AGI, the Turing test, and, of course, AI safety.


    Here is the paper: Scaling Transformer to 1M tokens and beyond with RMT 
    https://arxiv.org/abs/2304.11062
    Find Dr. Burtsev's profile here:
    https://lims.ac.uk/profile/?id=114
    Here are various other resources mentioned during the show:
    Mike's LinkedIn page and information about the IGLU contest (Interactive Grounded Language Understanding)
    https://www.linkedin.com/posts/mikhai...
    The Society of Mind, Marvin Minsky
    https://isbndb.com/book/9780671657130
    The Human Brain Project
    https://www.humanbrainproject.eu/en/b...
    Yann LeCun, JEPA: A Path Towards Autonomous Machine Intelligence
    https://www.reddit.com/r/MachineLearn...
    Mindstorms in Natural Language-Based Societies of Mind
    Jürgen Schmidhuber
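For the curious: the core idea behind the paper discussed in the episode can be illustrated in a few lines. This is a toy sketch only, not the authors' implementation: a long input is split into fixed-size segments, and a small set of memory tokens is carried from one segment to the next, so the effective context grows far beyond what a single pass can attend to. The `toy_transformer` function below is a hypothetical stand-in for the real model.

```python
SEGMENT_LEN = 512   # tokens the base model can attend to in one pass
MEMORY_LEN = 10     # number of memory tokens carried between segments

def toy_transformer(tokens):
    """Hypothetical stand-in for a transformer pass.

    A real Recurrent Memory Transformer would run attention over the
    memory tokens plus the segment and emit updated memory embeddings;
    here we simply keep the last MEMORY_LEN tokens as the new 'memory'.
    """
    return tokens[-MEMORY_LEN:]

def rmt_process(long_input):
    """Process an arbitrarily long token list segment by segment,
    threading the memory tokens through each call."""
    memory = [0] * MEMORY_LEN  # initial (empty) memory
    n_segments = 0
    for start in range(0, len(long_input), SEGMENT_LEN):
        segment = long_input[start:start + SEGMENT_LEN]
        # memory tokens are prepended to the current segment, so each
        # pass sees a summary of everything processed so far
        memory = toy_transformer(memory + segment)
        n_segments += 1
    return memory, n_segments

tokens = list(range(1_000_000))   # a "1M token" input
memory, n = rmt_process(tokens)
print(n)  # 1954 segments of up to 512 tokens each
```

The point of the recurrence is that memory cost per pass stays fixed (one segment plus a handful of memory tokens) no matter how long the input gets, which is what lets the approach scale to a million tokens and beyond.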


