In the previous article, we trained a predictive model in Python to forecast product demand based on daily features like store ID, price, discounts, and promotions.
We also saw how to operationalize that model locally using Azure Functions — turning our script into a live HTTP endpoint.
Now it’s time to take the next step:
We’ll deploy the function to Azure’s cloud environment so that it can be used in real-world tools like Power BI or Power Automate.
🚀 Overview: From Local to Cloud in 5 Steps
- Create the infrastructure on Azure: resource group, storage account, function app plan, and the function app itself.
- Prepare the project folder locally, with the trained `.pkl` model and the function code.
- Publish the function to Azure using `func azure functionapp publish`.
- Get the function URL and authentication key from the Azure portal.
- Test the endpoint using curl and Power BI.
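Step 5 can also be done from Python. The sketch below is a minimal `requests`-based equivalent of the curl smoke test; the URL and key are placeholders, and `call_forecast` is a hypothetical helper name, not part of the deployment itself.

```python
import json

def build_payload(rows):
    """Serialize a list of feature dicts into the JSON array the function expects."""
    return json.dumps(rows)

def call_forecast(url, rows):
    """POST the payload to the function and return the parsed JSON response.
    Requires the third-party `requests` package (pip install requests)."""
    import requests  # imported here so build_payload stays usable without it
    resp = requests.post(
        url,
        data=build_payload(rows),
        headers={"Content-Type": "application/json"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

# Example (needs a live deployment; URL and key are placeholders):
# call_forecast("https://<your-app>.azurewebsites.net/api/PredictDemand?code=<API_KEY>",
#               [{"Store ID": 101, "Product ID": 2001, "Price": 29.99}])
```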
💡 What the Azure Function Code Does
This function receives a JSON array, transforms the data to match the model’s expectations, predicts demand, and returns the result.
```python
import azure.functions as func
import pandas as pd
import joblib
import json
import logging

# Load the trained model once, at cold start
model = joblib.load("modelo_previsao_vendas.pkl")
feature_cols = model.feature_names_in_

def transform_input(raw_json):
    """One-hot encode the input and align it with the model's training columns."""
    df = pd.DataFrame(raw_json)
    df_encoded = pd.get_dummies(df)
    # Add any columns the model expects but the request lacks
    for col in feature_cols:
        if col not in df_encoded.columns:
            df_encoded[col] = 0
    # Enforce the exact column order the model was trained on
    df_encoded = df_encoded[feature_cols]
    return df_encoded

def main(req: func.HttpRequest) -> func.HttpResponse:
    try:
        raw_json = req.get_json()
        df_ready = transform_input(raw_json)
        preds = model.predict(df_ready)
        return func.HttpResponse(
            json.dumps({"predictions": preds.tolist()}),
            mimetype="application/json"
        )
    except Exception as e:
        logging.error(f"Error: {e}")
        return func.HttpResponse(f"Error: {str(e)}", status_code=500)
```
🔐 Authentication with Function Key
Azure protects HTTP-triggered functions using a function-level API key by default. You can find it in the Azure portal:
- Go to your Function App.
- Select your function (e.g., `PredictDemand`).
- Click “Get Function URL”.
- Copy the full URL — it includes the `code=...` parameter with the key.
⚠️ About Security
Passing the key via URL is simple and convenient, but not the most secure approach in production.
Other, more robust options include:
- Azure AD authentication (with identity tokens)
- Restricting IP ranges
- Using an API gateway or reverse proxy
For this tutorial, using the function key is perfectly fine.
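One small improvement that costs almost nothing: Azure Functions also accepts the key in the `x-functions-key` request header instead of the `code=` query parameter, which keeps it out of URLs, browser history, and access logs. A minimal sketch (the URL, key, and `build_request` helper are placeholders for illustration):

```python
import json

def build_request(base_url, function_key, rows):
    """Return (url, headers, body) for a keyed call; the key travels in the
    `x-functions-key` header rather than the query string."""
    headers = {
        "Content-Type": "application/json",
        "x-functions-key": function_key,
    }
    return base_url, headers, json.dumps(rows)

url, headers, body = build_request(
    "https://<your-function-app>.azurewebsites.net/api/PredictDemand",  # placeholder
    "<API_KEY>",  # placeholder
    [{"Store ID": 101, "Price": 29.99}],
)
print("code=" in url)  # False: the key never appears in the URL
```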
📊 Calling the Azure Function from Power BI
Now let’s call this cloud function directly from Power BI using Power Query (M). We’re sending the exact same input data we used in the previous article to confirm that the result matches.
✅ Final Power Query Code
```m
let
    url = "https://forecast-func-soigzb.azurewebsites.net/api/PredictDemand?code=API_KEY",
    payload = Text.ToBinary(
        "[{ ""Store ID"": 101, ""Product ID"": 2001, ""Category"": ""Electronics"", ""Region"": ""North"",
        ""Price"": 29.99, ""Discount"": 0.10, ""Weather Condition"": ""Sunny"", ""Holiday/Promotion"": ""Yes"",
        ""Competitor Pricing"": 31.50, ""Seasonality"": ""High"", ""DayOfWeek"": 2, ""Month"": 7, ""Year"": 2025 }]"
    ),
    response = Web.Contents(url,
        [
            Headers = [#"Content-Type" = "application/json"],
            Content = payload
        ]
    ),
    parsed = Json.Document(response),
    output = Record.ToTable(parsed),
    #"Extracted Values" = Table.TransformColumns(output, {{"Value", each Text.Combine(List.Transform(_, Text.From)), type text}}),
    #"Changed Type" = Table.TransformColumnTypes(#"Extracted Values", {{"Value", type number}})
in
    #"Changed Type"
```
This script:
- Sends the JSON payload via HTTP POST to the Azure Function.
- Parses the JSON response and extracts the prediction.
- Converts the value to a numeric column so it can be used in visualizations.
✅ Output
The function returned:
```json
{"predictions": [4.885143592336774]}
```
Just like it did locally — confirming that the model, transformation logic, and output structure are fully aligned.
You can also see the results directly in Power Query, as in the screenshot below. It’s easy to integrate the function into your existing data workflows: you could, for example, consume the API as a Power Query function and get live demand forecasts. That would let you build a simple stockout rule such as: “if the current inventory level is less than or equal to the demand forecast, take action against a stockout!”

Talking about action – enter Power Automate!
🤖 Consuming the Azure Function from SharePoint via Power Automate
So far, we’ve seen how to test our demand forecast function locally, publish it to Azure, and call it from Power BI.
But what if the input data comes from a business process — like a form submission or a new product listing in SharePoint?
In this section, we’ll show how to connect a SharePoint list to our Azure Function using Power Automate,
so that predictions can be made automatically whenever new data is added.
📄 1. Create the SharePoint List
Create a SharePoint list with the following fields:
| Column Name | Type | Sample Value |
|---|---|---|
| Store ID | Number | 101 |
| Product ID | Number | 2001 |
| Category | Single line of text | Electronics |
| Region | Single line of text | North |
| Price | Number | 29.99 |
| Discount | Number | 0.10 |
| Weather Condition | Single line of text | Sunny |
| Holiday/Promotion | Single line of text | Yes |
| Competitor Pricing | Number | 31.50 |
| Seasonality | Single line of text | High |
| DayOfWeek | Number | 2 |
| Month | Number | 7 |
| Year | Number | 2025 |
It will look like this:

🔁 2. Build the Flow in Power Automate
Create an automated flow with the trigger “When an item is created”, pointing at the SharePoint list you just created:

Then add an HTTP action to call the Azure Function.

🔧 Configuration
- Method: POST
- URL: `https://forecast-func-soigzb.azurewebsites.net/api/PredictDemand?code=API_KEY`
- Headers: `{ "Content-Type": "application/json" }`
- Body:

```json
[
  {
    "Store ID": @{triggerOutputs()?['body/StoreID']},
    "Product ID": @{triggerOutputs()?['body/ProductID']},
    "Category": "@{triggerOutputs()?['body/Category']}",
    "Region": "@{triggerOutputs()?['body/Region']}",
    "Price": @{triggerOutputs()?['body/Price']},
    "Discount": @{triggerOutputs()?['body/Discount']},
    "Weather Condition": "@{triggerOutputs()?['body/WeatherCondition']}",
    "Holiday/Promotion": "@{triggerOutputs()?['body/HolidayPromotion']}",
    "Competitor Pricing": @{triggerOutputs()?['body/CompetitorPricing']},
    "Seasonality": "@{triggerOutputs()?['body/Seasonality']}",
    "DayOfWeek": @{triggerOutputs()?['body/DayOfWeek']},
    "Month": @{triggerOutputs()?['body/Month']},
    "Year": @{triggerOutputs()?['body/Year']}
  }
]
```
⚠️ Tip: Double-check your internal SharePoint field names using the “Get item” action if necessary.
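One easy way to sanity-check the body you are building in Power Automate is to reproduce its shape in Python: numeric fields go unquoted, text fields quoted, and the whole thing wrapped in an array. The values below are illustrative stand-ins for the dynamic content:

```python
import json

# Hypothetical values standing in for the Power Automate dynamic content
item = {
    "Store ID": 101, "Product ID": 2001, "Category": "Electronics",
    "Region": "North", "Price": 29.99, "Discount": 0.10,
    "Weather Condition": "Sunny", "Holiday/Promotion": "Yes",
    "Competitor Pricing": 31.50, "Seasonality": "High",
    "DayOfWeek": 2, "Month": 7, "Year": 2025,
}
body = json.dumps([item])  # the function expects a JSON *array* of records
parsed = json.loads(body)
print(isinstance(parsed, list), isinstance(parsed[0]["Price"], float))  # True True
```

If a numeric field arrives quoted (e.g. `"Price": "29.99"`), `pd.get_dummies` inside the function will treat it as a category and the prediction will be wrong, so this distinction matters.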
📥 3. Parse and Use the Result
After the HTTP action, add:
✅ Parse JSON

Use this schema:
```json
{
  "type": "object",
  "properties": {
    "predictions": {
      "type": "array",
      "items": {
        "type": "number"
      }
    }
  }
}
```
💬 Compose
Use this expression to get the predicted value:

`@first(body('Parse_JSON')?['predictions'])`
You can store this result in another SharePoint column, use it in conditions, or send a notification.
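For readers more at home in Python than in Power Automate expressions, the Compose step above does the equivalent of this (the response text is the sample value from the Power BI test earlier):

```python
import json

# A response shaped like the function's output (value taken from the earlier test)
response_text = '{"predictions": [4.885143592336774]}'
body = json.loads(response_text)

# Mirrors first(body('Parse_JSON')?['predictions'])
first_prediction = body["predictions"][0]
print(first_prediction)  # 4.885143592336774
```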
✅ Result
Now every time a new item is added to the SharePoint list, Power Automate:
- Extracts the fields
- Sends them to the Azure Function
- Receives the forecasted demand
- Stores or uses the result automatically

🧠 Summary
By deploying our model as an Azure Function and integrating it with tools like Power BI and Power Automate, we’ve turned a Jupyter notebook into an enterprise-ready predictive service.
- In Power BI, we can visualize the forecasts
- In Power Automate, we can trigger predictions from business events
- The same function logic is reused across all tools — with no code duplication
Next up? You might want to:
- Add error handling and retry logic
- Introduce versioning for models
- Log predictions to a central database
- Secure the endpoint with OAuth instead of query keys
- Blow your mind with the incredible things you can do!
Let’s make AI useful — and usable — in real life!