Maximum Effort, Minimum Reward now has a podcast! I mean, I didn't do any work, but Google did! Hear AI make a podcast explaining an article about AI making a podcast! Are you confused? Me too!
Oh, brother. I've been toying with NotebookLM for about two years now (since soon after its initial launch). It's likely my favourite AI tool.
For my top use cases, I really like the cut of Perplexity's jib, even more than Gemini, but NBLM still feels the most novel, the most fun and usually the most successful in generating the results I need. I'm using the Pro version of Gemini/NBLM and Perplexity, so that may be why I prefer them over ChatGPT and other models (not to mention NBLM's virtually seamless integration with other models/tools).
But damn, is it ever fun. I've used it for Power & Philately (my Substack) as well as for my freelance journalism, plus a bunch of other incredible and weird use cases. It can get pretty wild. I've found my sweet spot to be 30-60 sources and roughly 30-minute audio overviews; anything longer seems to be unwieldy for the model (so far).
Interesting! I once tried uploading chapters from a textbook one by one to see if it could create lectures. The answer was yes-ish. Not didactic enough, but not useless either.
Still a fantastic read 10 months after publication. Sweet.
You gotta try it out on your own articles. It’s such a weird experience the first time.
Love this! I'm planning on doing the same thing with some of the articles/deep dives I've written. I remember how excited I got when they demo'd this at Google I/O a few months back, and I'm glad the actual product produces good results.
Glad you like it! I think I might just do this by default for each of my articles going forward; I really love this feature.