Conference API Sample

Ricardo Mauro

Jun 1, 2019

This is a sample Conference API hosted by Microsoft:

https://conferenceapi.azurewebsites.net/?format=json
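
To try it out, here is a minimal sketch in Python (standard library only) that fetches the endpoint and pretty-prints the JSON response. It assumes the Microsoft-hosted endpoint is still online and still serves JSON at that URL.

# Minimal sketch: fetch the sample Conference API root as JSON.
# Assumes the Microsoft-hosted endpoint is still online and returns JSON.
import json
import urllib.request

URL = "https://conferenceapi.azurewebsites.net/?format=json"

with urllib.request.urlopen(URL, timeout=10) as response:
    data = json.load(response)

# Pretty-print the root resource so you can explore what the API exposes.
print(json.dumps(data, indent=2))

From there you can follow whatever links or resources the root document exposes, or point an API tool such as Postman or curl at the same URL.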
