Inspired by OpenAI and Google, Perplexity AI has launched Deep Research, a tool for autonomously conducting in-depth research and analysis. The feature performs multiple searches, reviews hundreds of sources, and compiles the findings into comprehensive reports. It is free for everyone: non-subscribers get up to 5 queries per day, while Pro users get 500 queries per day.
According to Perplexity, Deep Research can handle expert-level tasks across finance, marketing, product research, and other domains. “We believe everyone should have access to powerful research tools,” the company stated.
“Excited to introduce the Perplexity Deep Research Agent: available for free to all users. Paid users only need to pay $20/mo to access an expert level researcher on any topic for 500 daily queries, and need to wait less than three minutes for getting a full research report,” said Perplexity AI co-founder and CEO Aravind Srinivas in a post on X.
By comparison, OpenAI’s deep research is available only as part of the ChatGPT Pro plan, which costs $200 per month and is limited to 100 queries per month.
The feature is initially available on the web and will soon be rolled out to iOS, Android, and Mac. Users can access it by selecting “Deep Research” from the mode selector before submitting a query.
Perplexity claims its Deep Research model scores 21.1% on Humanity’s Last Exam, outperforming Gemini Thinking, o3-mini, o1, DeepSeek-R1, and other top models. It also records 93.9% accuracy on the SimpleQA benchmark, surpassing other leading AI models.
The company positions Deep Research as a tool for various applications, including financial analysis, market research, and personal consulting. “Deep Research takes question answering to the next level,” Perplexity stated. The AI system is designed to function similarly to human researchers, refining its understanding as it gathers more data.
Deep Research operates in three phases: iterative research, report writing, and export. The AI searches for information, evaluates sources, refines its approach, and synthesises the findings into a report. Users can then export reports as PDFs, documents, or Perplexity Pages for sharing.
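The sketch below illustrates, in broad strokes, what such a search–evaluate–refine–report loop could look like. The helper functions (web_search, refine_query, write_report) and the credibility scoring are hypothetical placeholders for illustration, not Perplexity's actual implementation.

```python
"""Minimal sketch of an iterative research loop, under assumed helper functions."""

from dataclasses import dataclass


@dataclass
class Source:
    url: str
    snippet: str
    score: float  # crude credibility score, 0..1


def web_search(query: str) -> list[Source]:
    # Placeholder: a real agent would call a web-search API here.
    return [Source(url=f"https://example.com/{hash(query) % 1000}",
                   snippet=f"Result for: {query}", score=0.8)]


def refine_query(question: str, findings: list[Source]) -> str:
    # Placeholder: a real agent would use an LLM to decide what to search next.
    return f"{question} (follow-up #{len(findings)})"


def write_report(question: str, findings: list[Source]) -> str:
    # Placeholder: a real agent would synthesise the findings with an LLM.
    cited = "\n".join(f"- {s.url}: {s.snippet}" for s in findings)
    return f"# Report: {question}\n\nSources reviewed:\n{cited}\n"


def deep_research(question: str, max_rounds: int = 3) -> str:
    """Iterative phase: search, keep credible sources, refine; then write a report."""
    findings: list[Source] = []
    query = question
    for _ in range(max_rounds):
        results = web_search(query)
        findings.extend(s for s in results if s.score >= 0.5)  # evaluate sources
        query = refine_query(question, findings)               # refine the approach
    return write_report(question, findings)                    # report-writing phase


if __name__ == "__main__":
    print(deep_research("How does Cerebras Inference achieve its throughput?"))
```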
The company has also optimised Deep Research for speed. Separately, Perplexity recently announced that its in-house model, Sonar, is available to all Pro users on the platform; subscribers on the Pro plan can now set Sonar as their default model in settings.
Sonar is built on top of Meta’s open-source Llama 3.3 70B and is powered by Cerebras Inference, which Cerebras bills as the world’s fastest AI inference engine. The model can produce 1,200 tokens per second.
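For a rough sense of what that throughput means in practice, here is a back-of-the-envelope illustration; the 3,000-token report length is an assumption for the example, not a figure from Perplexity or Cerebras.

```python
# Back-of-the-envelope: time to generate a report at the claimed throughput.
tokens_per_second = 1_200      # claimed Cerebras-backed generation speed
report_length_tokens = 3_000   # assumed length of a long-form research report
print(f"~{report_length_tokens / tokens_per_second:.1f} s")  # -> ~2.5 s
```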