Building a Smart Indian Food Recommender Using TinyLlama, Ollama, and Python
In this tutorial, we'll build a simple yet smart food recommender system that suggests Indian dishes based on the current humidity and temperature. The system uses TinyLlama, a small language model running locally via Ollama, together with Python's `random.randint()` function to simulate real-time weather readings.
Why Indian Food?
India has a rich and diverse culinary tradition. Different foods are consumed based on seasons, regional temperatures, and personal preferences. For example, during hot summers, people prefer light, hydrating dishes like buttermilk, lassi, or fruit chaat. In contrast, winter calls for spicier and warmer dishes like parathas, curries, and soups.
Setup Requirements
- Ollama: Install from https://ollama.com
- TinyLlama: Pull the model with `ollama pull tinyllama`
- Python: Install the `ollama` client library with `pip install ollama`; the `random` module is part of the standard library
Understanding the Code
Here's the core code we'll use:

```python
import ollama
import random

while True:
    # Simulate current weather readings
    humid = random.randint(30, 100)  # relative humidity, %
    temp = random.randint(7, 45)     # temperature, degrees Celsius
    print("Humidity:", humid)
    print("Temperature:", temp)

    # Ask the local TinyLlama model for food suggestions
    response = ollama.generate(
        model="tinyllama",
        prompt=f"Humidity is {humid}% and temp is {temp}C. Suggest a list of Indian foods."
    )
    print(response["response"])
```
This Python script performs the following:
- Generates a random temperature and humidity level
- Prints the values
- Uses TinyLlama to suggest relevant Indian food
Simulating the Weather
We simulate weather using `random.randint(lower, upper)`, which returns a random integer N with lower <= N <= upper.
For instance, temperature is selected randomly between 7°C and 45°C, while humidity ranges from 30% to 100%. These values represent the typical weather conditions found in various parts of India.
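As a minimal sketch, the simulation can be wrapped in a small helper (the function name here is ours, not part of the original script):

```python
import random

def simulate_weather():
    """Return a (temperature, humidity) pair within typical Indian ranges."""
    temp = random.randint(7, 45)     # degrees Celsius
    humid = random.randint(30, 100)  # percent relative humidity
    return temp, humid

temp, humid = simulate_weather()
print(f"Temperature: {temp}C, Humidity: {humid}%")
```

Wrapping the simulation in a function makes it easy to swap in a real data source later without touching the rest of the loop.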
Mathematical Representation
Let us define two variables:
- \( T \): Temperature in Celsius
- \( H \): Humidity in %
We can define a composite index \( S \) as:
$$ S = \frac{T + H}{2} $$
Based on this score, we can categorize the climate and guide food suggestions:
- If \( S > 70 \): Recommend cold, hydrating food
- If \( 50 < S \leq 70 \): Recommend light vegetarian meals
- If \( S \leq 50 \): Recommend hot, spicy, and heavier meals
This formula is optional in the code, as TinyLlama handles reasoning based on the natural language prompt.
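If you do want the score in code, the thresholds above translate into a small helper. This is an optional sketch, since the script relies on TinyLlama's own reasoning; the function name and return strings are our choices:

```python
def classify_climate(temp, humid):
    """Categorize climate using the composite score S = (T + H) / 2."""
    s = (temp + humid) / 2
    if s > 70:
        return "cold, hydrating food"
    elif s > 50:
        return "light vegetarian meals"
    else:
        return "hot, spicy, and heavier meals"

# With the example values from the next section: (39 + 90) / 2 = 64.5
print(classify_climate(39, 90))  # light vegetarian meals
```

A category like this could also be appended to the prompt to nudge TinyLlama toward a particular style of suggestion.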
Example Output
Let’s say our simulated values are:
- Temperature: 39°C
- Humidity: 90%
The prompt becomes:
"Humidity is 90% and temp is 39C. Suggest a list of Indian foods."
Sample output from TinyLlama might be:
During hot and humid weather, you can enjoy:
- Buttermilk (Chaas)
- Fresh coconut water
- Fruit chaat
- Khichdi with curd
- Watermelon juice
Why Use TinyLlama with Ollama?
TinyLlama is a compact 1.1B-parameter language model designed to run locally on modest hardware such as laptops and edge devices. Ollama provides an easy way to run models like it through a simple CLI and a local API.
Benefits of This Approach
- No Internet Required: Everything runs offline
- Fast and Lightweight: Ideal for quick prototyping
- Customizable: Prompt engineering makes it versatile
Extending the Project
Here are a few ways you can extend this script:
- Integrate real-time weather data using an API like OpenWeatherMap
- Build a web interface using Flask or Django
- Add regional preferences based on Indian states
- Use voice output with text-to-speech libraries
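For the first extension, a hedged sketch of fetching real weather is shown below. The OpenWeatherMap current-weather endpoint is a real public API, but the `CITY` and `API_KEY` placeholders are yours to fill in, and the `build_prompt` helper is our own name for the prompt the main loop already constructs:

```python
import json
import urllib.parse
import urllib.request

def fetch_weather(city, api_key):
    """Fetch current temperature (C) and humidity (%) from OpenWeatherMap."""
    url = "https://api.openweathermap.org/data/2.5/weather?" + urllib.parse.urlencode(
        {"q": city, "appid": api_key, "units": "metric"}
    )
    with urllib.request.urlopen(url, timeout=10) as resp:
        data = json.load(resp)
    return data["main"]["temp"], data["main"]["humidity"]

def build_prompt(temp, humid):
    """Build the same prompt the main loop sends to TinyLlama."""
    return f"Humidity is {humid}% and temp is {temp}C. Suggest a list of Indian foods."
```

With these two helpers, replacing the `random.randint()` calls with `fetch_weather("Mumbai", API_KEY)` turns the toy into a live recommender.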
Conclusion
Combining Python, TinyLlama, and Ollama gives you the power to build offline AI assistants with very little setup. This food recommendation project is just one of many possible applications. You can also adapt this logic for clothing suggestions, travel plans, or even smart home automation based on weather data.
As a closing note, here’s the simplified score formula again:
$$ S = \frac{T + H}{2} $$
Although we didn’t directly use this in our code, it’s a useful concept to build more structured AI rules on top of LLM-generated content.
Happy coding and bon appétit with your AI chef!
References
- Ollama Official Website (https://ollama.com): Platform for running local language models like TinyLlama easily via CLI and API.
- TinyLlama GitHub Repository (https://github.com/jzhang38/TinyLlama): Lightweight open-source LLM designed for efficient inference and local deployment.
- Python random Module Documentation (https://docs.python.org/3/library/random.html): Official Python documentation for the random number generation utilities.