Might be helpful for those who
- don’t have access to hardware that can run models locally
- understand the benefits and limitations of generative AI
Link: https://duckduckgo.com/?q=DuckDuckGo&ia=chat
As a nice coincidence, one of the first results when I searched for a news update was this discussion:
https://discuss.privacyguides.net/t/adding-a-new-category-about-ai-chatbots/17860/2
Honestly, I’m really impressed… DDG works over VPN and Tor, and that includes these chatbots. That’s a great improvement for privacy.
Same. Genuinely impressed
Is there a YouTube video under 10 minutes that compares the different AI models available from DuckDuckGo?
I use Mixtral 8x7B locally and it’s been great. I’m genuinely excited to see DDG offering it, and the service in general. Now I can use this model even when I’m not on my own network.
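For anyone curious what “running it locally” looks like in practice, here’s a minimal sketch of talking to a locally hosted Mixtral through Ollama’s HTTP API. The model tag, default port, and prompt are assumptions; adjust them to whatever your local setup actually serves.

```python
import json
import urllib.request

def build_request(prompt, model="mixtral:8x7b", host="http://localhost:11434"):
    """Build a POST request for a local Ollama server's /api/generate endpoint.

    The default port (11434) and model tag are assumptions based on a
    typical Ollama install; change them to match your setup.
    """
    body = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # ask for one complete response instead of a token stream
    }).encode("utf-8")
    return urllib.request.Request(
        f"{host}/api/generate",
        data=body,
        headers={"Content-Type": "application/json"},
    )

req = build_request("Summarize the trade-offs of running an LLM locally.")

# Actually sending it requires a running Ollama instance, e.g.:
# with urllib.request.urlopen(req) as resp:
#     print(json.loads(resp.read())["response"])
```

The nice part of doing it this way is that nothing leaves your machine, which is exactly the privacy angle being discussed here.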
Dunno, but Llama 3 is the best open-source model and Claude 3 is the best overall model they offer.
You provided no reasoning, but I choose to just believe you. Thank you, wise person on the Internet.
It still amazes me just how quickly these models can spit out complex answers pretty much instantly.
…complex almost entirely wholly-hallucinated answers that only have as much bearing on reality as ‘some dude who is very talkative and heard about a bunch of stuff second-hand, and who is also high as balls and experiencing a manic episode where they think they know everything’
LOL. Yeah, sometimes, answers can be very much “I’m winging it today”, but certain prompts, especially for story ideas, can be very interesting and usable.
I’ve always said that if you know a lot about a subject, you can easily spot how AI generally tries to fake it till it makes it.
But if you have no idea about something, the answers you get are certainly better than what your buddy might tell you 😂
But to my point, it comes up with long-form content so fast that you wonder how the hell it actually processed the question that quickly.
Lotsa processing power behind it all!