Operationalize Your Machine Learning Model with Azure Functions

In the previous article, I demonstrated how to train a machine learning model in Python to forecast daily product demand for a retail business. Now it’s time to take that model from a notebook into a real-world environment — where it can receive new data and return predictions on demand.

How can we do that?
And more importantly: how can we automatically trigger those forecasts from tools like Power BI, Power Apps, or Power Automate?

The answer is: Azure Functions.
And the good news is — it’s actually quite simple to implement and scale.

🧱 Step 1 – Create the Local Project Folder

First, let’s set up the local environment where we’ll build and test the solution.

We’ll use Visual Studio Code (VS Code) — a free and lightweight IDE developed by Microsoft. It integrates seamlessly with the Azure Functions runtime and supports Python development.

mkdir forecast-api
cd forecast-api

This creates a new folder named forecast-api and navigates into it. This will be the root of our project — containing all function code, configuration files, and the trained model (.pkl).

Type the following command in a PowerShell terminal to open the forecast-api folder in Visual Studio Code:

code .

✅ Tip: If you don’t have the Azure Functions extension installed in VS Code yet, now’s a good time to do it. Click the Extensions icon on the left, type Azure Functions in the search bar, and follow the installation instructions.

🐍 Before You Start: Make Sure Python is Installed

Before creating a virtual environment, make sure you have Python 3.10 or later installed:

py --version

(On macOS/Linux, use python3 --version instead; the py launcher is Windows-only.)

If not installed, download it from:
👉 python.org/downloads

✅ Be sure to select the “Add Python to PATH” option during setup.

🧪 Step 2 – Create a Virtual Environment

To keep project dependencies isolated and reproducible, we’ll create a virtual environment.

py -m venv .venv

This creates a .venv folder with a dedicated Python interpreter and pip installation.

🧠 Step 3 – Activate the Virtual Environment

On Windows:

.venv\Scripts\activate

On macOS / Linux:

source .venv/bin/activate

Once activated, your terminal should show (.venv). From now on, any pip install will only affect this project.

pip install azure-functions pandas joblib scikit-learn

This command downloads and installs these four libraries only inside your .venv, keeping your project dependencies isolated from your system Python.

⚙️ Step 4 – Scaffold the Azure Function App

We’re now ready to initialize the Azure Functions project. Run:

func init . --worker-runtime python

This creates the base structure and configuration needed to build Azure Functions in Python. Specifically, it generates the core configuration files:

  • host.json: app-level settings
  • local.settings.json: secrets (for local only)
  • .vscode/: VS Code integration
  • requirements.txt: Python dependencies
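When you eventually deploy, Azure installs exactly what requirements.txt lists, so the packages installed in Step 3 need to appear there. One quick way to capture them (a sketch, assuming your virtual environment is still active):

```shell
# Write the currently installed package versions into requirements.txt
pip freeze > requirements.txt
```

Pinning exact versions this way keeps the cloud environment consistent with the one you tested locally.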

🚀 Step 5 – Create the HTTP Function

With the classic Python programming model (pre-v2), each function lives inside its own folder and consists of two key files:

  • __init__.py – your Python logic
  • function.json – metadata and bindings

Use the Azure Functions Core Tools CLI to scaffold the HTTP function:

func new --name PredictDemand --template "HTTP trigger" --authlevel "anonymous"

This command creates a new folder called PredictDemand/ containing the files needed for an HTTP-triggered function.

✅ Tip: If you see a folder like PredictDemand/ with __init__.py and function.json, that means you’re using the classic model — which works perfectly with Python 3.x and is still widely supported.

📁 Final Folder Structure (so far)

forecast-api/
├── .venv/
├── PredictDemand/
│   ├── __init__.py              ← your function logic
│   └── function.json            ← HTTP trigger bindings
├── modelo_previsao_vendas.pkl  ← trained ML model
├── requirements.txt            ← Python dependencies
├── host.json                   ← global app config
├── local.settings.json         ← local env settings
├── ForecastDemand.ipynb        ← training notebook (optional)
├── generate_payload.py         ← helper script to build JSON
└── predict_sample.json         ← test payload for curl/Postman
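The helper script generate_payload.py is just a convenience for producing a test payload file. A minimal sketch (the field names here are assumed from the training data of the previous article):

```python
import json

# One hypothetical sample record using the same fields as the training data
payload = [{
    "Store ID": 101,
    "Product ID": 2001,
    "Category": "Electronics",
    "Region": "North",
    "Price": 29.99,
    "Discount": 0.10,
    "Weather Condition": "Sunny",
    "Holiday/Promotion": "Yes",
    "Competitor Pricing": 31.50,
    "Seasonality": "High",
    "DayOfWeek": 2,
    "Month": 7,
    "Year": 2025,
}]

# Save it as the payload file we'll POST to the function later
with open("predict_sample.json", "w") as f:
    json.dump(payload, f, indent=2)
```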

🧠 Step 6 – Load the Model and Handle Predictions

Now we’ll write the Python logic that powers our prediction API.

Open the PredictDemand/__init__.py file and replace its contents with the following code:

import azure.functions as func
import joblib
import pandas as pd
import json
import logging

# Load the trained model when the function app starts
model = joblib.load("modelo_previsao_vendas.pkl")
feature_cols = model.feature_names_in_

def transform_input(raw_json):
    df = pd.DataFrame(raw_json)
    df_encoded = pd.get_dummies(df)

    # Ensure all expected features are present
    for col in feature_cols:
        if col not in df_encoded.columns:
            df_encoded[col] = 0

    return df_encoded[feature_cols]

def main(req: func.HttpRequest) -> func.HttpResponse:
    try:
        raw_json = req.get_json()
        df_ready = transform_input(raw_json)
        preds = model.predict(df_ready)

        return func.HttpResponse(
            json.dumps({"predictions": preds.tolist()}),
            mimetype="application/json"
        )

    except Exception as e:
        logging.error(f"Error: {e}")
        return func.HttpResponse(f"Error: {str(e)}", status_code=500)
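The alignment step inside transform_input (adding any dummy columns the request is missing, then reordering to match training) is worth seeing in isolation. A minimal sketch with hypothetical feature names:

```python
import pandas as pd

# Hypothetical columns the model was trained on
feature_cols = ["Price", "Category_Books", "Category_Electronics"]

# An incoming request that only contains one category value
raw_json = [{"Price": 29.99, "Category": "Electronics"}]
df_encoded = pd.get_dummies(pd.DataFrame(raw_json))

# Add the training-time dummy columns this request lacks,
# then reorder so the columns match what the model saw during fit
for col in feature_cols:
    if col not in df_encoded.columns:
        df_encoded[col] = 0
df_ready = df_encoded[feature_cols]

print(list(df_ready.columns))  # → ['Price', 'Category_Books', 'Category_Electronics']
```

Without this step, a request missing a category would produce a DataFrame whose columns don't match the model's, and model.predict() would fail.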

Next, make sure you have a function.json file inside the PredictDemand/ folder with the following content:

{
  "scriptFile": "__init__.py",
  "bindings": [
    {
      "authLevel": "anonymous",
      "type": "httpTrigger",
      "direction": "in",
      "name": "req",
      "methods": [ "post" ]
    },
    {
      "type": "http",
      "direction": "out",
      "name": "$return"
    }
  ]
}

This file tells Azure Functions to expose the main() method via HTTP POST, with anonymous access enabled.

✅ Tip: This setup works with the traditional Azure Functions programming model and is fully compatible with Python 3.x on Windows, macOS, or Linux.


📦 Step 7 – Place the Trained Model in the Project

Make sure that the file modelo_previsao_vendas.pkl — which we trained in the previous article — is located in the root folder of your project (forecast-api/).

This is the file that will be loaded by joblib.load() inside your PredictDemand/__init__.py function. When you run the app locally with func start, the host's working directory is the project root, so the relative path resolves correctly.
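If you would rather not depend on the working directory (it can differ once the app is deployed), a common variation is to resolve the path relative to __init__.py itself. A sketch:

```python
import os

# In PredictDemand/__init__.py, the project root is one level up from this file
root = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
model_path = os.path.join(root, "modelo_previsao_vendas.pkl")
# model = joblib.load(model_path)
```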

▶️ Step 8 – Run and Test Locally

Start the Azure Functions host locally by running:

func start

You should see something like:

Functions:

    PredictDemand: [POST] http://localhost:7071/api/PredictDemand

That means your function is up and running! You can now send POST requests to test it.

Here’s how to do it from PowerShell using Invoke-RestMethod:

Invoke-RestMethod -Method Post `
  -Uri http://localhost:7071/api/PredictDemand `
  -ContentType "application/json" `
  -InFile "predict_sample.json"

✅ Make sure your predict_sample.json file contains a list of dictionaries matching the expected features. For example:

[
  {
    "Store ID": 101,
    "Product ID": 2001,
    "Category": "Electronics",
    "Region": "North",
    "Price": 29.99,
    "Discount": 0.10,
    "Weather Condition": "Sunny",
    "Holiday/Promotion": "Yes",
    "Competitor Pricing": 31.50,
    "Seasonality": "High",
    "DayOfWeek": 2,
    "Month": 7,
    "Year": 2025
  }
]

Alternatively, you can use Postman to send a JSON request with the required data fields and view the response.

✅ Tip: If your model uses one-hot encoding, ensure that the input includes all required columns or that your function handles missing dummies gracefully.

🔍 Understanding the Prediction

For the sample payload above, the function responds with:

{
  "predictions": [4.88514359233677]
}

This number represents the model’s prediction: in this scenario, it expects around 4.88 units sold for the specified product, store, and conditions.

📈 Why is this useful?

This seemingly simple number unlocks several powerful use cases:

  • It can be used to detect stockout risks — if inventory is lower than the predicted demand, an alert can be triggered.
  • It helps optimize pricing and promotions, by simulating different discount or competitor pricing scenarios.
  • It can be integrated into dashboards and reports in Power BI.
  • It can trigger automated workflows in Power Automate, such as restocking alerts or order generation.
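The stockout check in the first bullet boils down to a comparison between the prediction and the units on hand. A tiny sketch with made-up numbers:

```python
import math

# Hypothetical values: the API's prediction vs. current inventory
predicted_demand = 4.88514359233677
current_stock = 3

# Round demand up to whole units before comparing
if current_stock < math.ceil(predicted_demand):
    print("Stockout risk: trigger a restocking alert")
```

In practice the same comparison would live in a Power Automate condition or a scheduled job, with the prediction coming from the API.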

✅ Real-time forecasting becomes actionable

With this API in place, any tool that can make HTTP requests can now trigger real-time predictions — including Power BI, Power Apps, or your ERP.

Want to see how that works? Stay tuned for the next part.

🧠 Recap

At this point, you’ve built a fully functional prediction API using:

  • Python and joblib to load and run your trained ML model
  • Azure Functions to create a lightweight, serverless HTTP endpoint
  • Pandas to handle incoming data

You can now trigger this model from any tool or system capable of making HTTP requests — including Power BI, Power Automate, or a web application.

🚀 What’s Next?

In the next article, we’ll publish this function to Azure and connect it with other systems — turning your local API into a production-ready cloud service.

We’ll also:

  • Secure it with an API key
  • Use it from Power BI via Power Query and Web.Contents()
  • Create alert-based automation with Power Automate

📦 Ready to go from local to cloud? Stay tuned for Part 3!
