When AI first hit the scene in its current form, I was dead set against it because of the generative nature of what was being sold to the public. I considered any shortcut to creating art to be offensive to the craft.
But then I realized I could use AI for something that traditional searching was starting to fail at: Research.
With both sides of my writing career (fiction and nonfiction), I have to do quite a bit of research, and Google was becoming a hindrance to that process. Instead of being fed helpful information, I was inundated with ads, sponsored content, and its own AI-based answers (which were rarely helpful).
I first kicked the tires with Opera's Aria, which showed me that AI could actually be helpful. At the same time, I realized that AI also had to be supervised because it could be wrong as easily as it could be right.
I also found that AI could lead me down some fun rabbit holes, where I might discover something genuinely interesting to investigate. Ultimately, that journey led me to two AI tools, both of which can be installed and used on Linux for free.
Those two tools have helped me get more done on a daily basis.
Ollama is an open-source tool for running large language models locally. Its open-source nature is one of the primary reasons I was drawn to it: developers around the world can vet its code, and to date, no one has reported finding anything untoward in it.
On top of that, Ollama is easy to install and use, and the fact that you can download and run several different LLMs is a bit of delicious icing on an already sweet cake. I can use Cogito, Gemma 3, DeepSeek R1, Llama 3.3, Llama 3.2, Phi 4, QwQ, and many more.
But the main reason I prefer Ollama over any other AI tool is that it runs locally, which means my queries aren't accessible to a third party. I like that level of privacy.
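For anyone who wants to try the same setup, getting Ollama running on Linux takes just a few commands. This is a sketch: the install script is the official one from ollama.com, and the model name is just an example; swap in whichever LLM you prefer.

```shell
# Install Ollama on Linux using the official install script
curl -fsSL https://ollama.com/install.sh | sh

# Download a model (Llama 3.2 as an example), then start an interactive chat
ollama pull llama3.2
ollama run llama3.2
```

Once a model is pulled, everything runs on your own hardware, so no query ever leaves the machine.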
But how does Ollama help me get things done? First, there's the prompts library in Msty (the desktop front end I use with Ollama), which gives you access to several quick prompts and even lets you create custom ones. One prompt I often type is "Do a deep dive into the following topic and make sure to explore any relevant side topics:". Instead of typing that out every time, I can create a quick prompt for it, so all I have to type is the subject matter. On top of that, I don't have to remember to ask Ollama to explore relevant side topics.
I create that quick prompt within the library so I can easily call on it whenever I need it. That saves time and ensures the prompt comes out right every time; I don't have to think about the wording, and I can make the prompt as simple or as complicated as I need.
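For those who would rather script this than use a GUI, the same quick-prompt idea can be reproduced against Ollama's local REST API, which listens on port 11434 by default. The prompt template and model name below are illustrative assumptions, not the only options.

```python
# A minimal sketch of the "quick prompt" idea: store reusable prompt
# templates once, then supply only the subject matter each time.
import json
import urllib.request

QUICK_PROMPTS = {
    "deep-dive": ("Do a deep dive into the following topic and make "
                  "sure to explore any relevant side topics: {topic}"),
}

def build_prompt(name: str, topic: str) -> str:
    """Expand a stored quick prompt with the given topic."""
    return QUICK_PROMPTS[name].format(topic=topic)

def ask_ollama(prompt: str, model: str = "llama3.2") -> str:
    """Send the prompt to a locally running Ollama server."""
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=json.dumps({"model": model, "prompt": prompt,
                         "stream": False}).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Usage (requires `ollama serve` running locally):
# print(ask_ollama(build_prompt("deep-dive", "the history of Unix")))
```

The template only needs to be written once; after that, every query gets the "explore side topics" instruction for free.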
Creating quick prompts in Msty is a sure-fire way to make your daily work a bit more efficient.
The Prompts Library is very helpful, especially when I have more complex prompts that I regularly type.
Next, there are knowledge stacks, which let me add documents of my own (which always remain local) so the LLM I've chosen can use that information as a source. Let's say I've written several articles on a single subject and want to use their combined information to answer some questions. I could go back through and read everything in that series, or I could add the articles to a knowledge stack and then ask my question(s). Ollama will search through every document in the stack and use that information in its response.
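Under the hood, a knowledge stack is a form of retrieval-augmented generation: find the documents most relevant to the question, then hand them to the model as context. Here's a toy sketch of that flow; the word-overlap scoring is a stand-in for the embedding-based search real tools use, and the document contents are made up.

```python
# A minimal sketch of the "knowledge stack" idea: keep local documents,
# pull the most relevant ones for a question, and prepend them to the
# prompt so a local model answers from your own material.

def score(doc: str, question: str) -> int:
    """Count question words that also appear in the document (naive)."""
    doc_words = set(doc.lower().split())
    return sum(1 for w in question.lower().split() if w in doc_words)

def retrieve(stack: dict[str, str], question: str, k: int = 2) -> list[str]:
    """Return the names of the k documents most relevant to the question."""
    ranked = sorted(stack, key=lambda name: score(stack[name], question),
                    reverse=True)
    return ranked[:k]

def build_rag_prompt(stack: dict[str, str], question: str) -> str:
    """Prepend the retrieved documents to the question as context."""
    names = retrieve(stack, question)
    context = "\n\n".join(f"[{n}]\n{stack[n]}" for n in names)
    return (f"Answer using only this context:\n\n{context}\n\n"
            f"Question: {question}")

# Hypothetical stack of previously written articles
stack = {
    "article-1": "Ollama runs large language models locally on Linux.",
    "article-2": "Perplexity offers a Research mode for deep dives.",
    "article-3": "Msty is a desktop front end for local models.",
}
print(retrieve(stack, "How do I run models locally on Linux?", k=1))
```

The resulting prompt can then be sent to any local model, keeping both the documents and the question on your own machine.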
It's really helpful.
The second tool is Perplexity, which also offers a desktop app. The app is pretty much the same as using Perplexity.ai in your browser, but I do find it a bit more efficient.
There are two main features that help me with my daily tasks: Search and Research.
If I want to do a standard search with Perplexity, I click the Search button, type my query, and hit Enter on my keyboard. If, on the other hand, I need a deeper dive into a subject, I hit Research and type my query.
One thing you should know about the Research option is that it truly does a deep dive and can take up to 30 minutes to deliver results. But when you really need to get into a subject, that feature is a must. A nice touch: while Research does its thing, you can click Tasks to watch the sources being used for the deep dive.
Watching Perplexity do its thing can be fascinating.
One thing to keep in mind: the free version limits how many Research queries you can run per day. You can upgrade to the Professional plan, which costs $20 per month, for unlimited free searches and 300+ Pro searches per day.
Another very handy feature in Perplexity is Spaces. With this feature, I can create custom spaces for different topics. I can then switch spaces, run a query, and know that the query will be isolated to that space, meaning when I want to recall that query, I only have to switch to the space and find it. That makes it much easier to keep track of previous queries without having to scour through a long list.
Between those two AI tools on my Linux desktop, I am able to get much more done on a daily basis. I would highly recommend you give one (or both) of these a try.