This repository was archived by the owner on May 20, 2025. It is now read-only.
docs/guides/python/serverless-ai-api.mdx (9 additions & 4 deletions)
@@ -49,7 +49,7 @@ Before we begin, make sure you have the following:

preferred cloud.
</Note>

-## Project Setup - Your First Nitric App
+## Project Setup

Nitric projects have a simple structure: your code (services, functions, etc.) lives in a directory (by default called `services/`), and a `nitric.yaml` config file describes high-level settings (like which files are services and what runtime to use). We'll use the Nitric CLI to scaffold a new project:
@@ -63,14 +63,14 @@ This will create a new folder `my-summarizer` with a Python starter template. (Y

If none of the templates suit your needs, you can always create your own.

-2. **Install dependencies:** Navigate into the project and install the requirements. The template uses Pipenv, so you can do:
+2. **Install dependencies:** Navigate into the project and install the requirements. The template uses uv, so you can do:

```bash
cd my-summarizer
uv sync
```

-This will install Nitric's Python SDK (and any other deps). If you used a different template, change this step to install dependencies accordingly.
+This will install Nitric's Python SDK (and any other dependencies). If you used a different template, change this step to install dependencies accordingly.

3. **Explore the scaffold:** The generated project should have a structure like:
@@ -126,6 +126,9 @@ This will fetch the API key from your environment. (If you prefer, Nitric also h

```python
# Create OpenAI client
+# This could be a local model during development if you have Ollama or similar installed.
+# For example, if you have Ollama running locally, you could use:

# Call OpenAI API to get summary (this is a blocking call, for demo simplicity)
try:
    response = client.chat.completions.create(
+        # This model could be replaced with another of your choice.
+        # Using tools like Ollama, it could even be a local model.
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": f"Summarize the following text:\n\n{text_to_summarize}"}],
        max_tokens=50
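As a sketch of what the snippet above sends, the chat-completions payload can be built as a plain dictionary. This is illustrative only: the helper name is hypothetical, and the commented `base_url`/model values are assumptions about a local Ollama install, which exposes an OpenAI-compatible endpoint on port 11434.

```python
def build_summarize_request(text: str, model: str = "gpt-3.5-turbo") -> dict:
    """Build the chat-completions payload used in the snippet above (hypothetical helper)."""
    return {
        # Could be swapped for a local model name (e.g. "llama3") when using Ollama.
        "model": model,
        "messages": [
            {"role": "user", "content": f"Summarize the following text:\n\n{text}"}
        ],
        "max_tokens": 50,
    }

# During development you could point the OpenAI client at Ollama's
# OpenAI-compatible endpoint instead of api.openai.com (assumed values):
#   client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")
payload = build_summarize_request("Nitric lets you build cloud apps in Python.")
```

Because the payload shape is the same either way, switching between a hosted and a local model is just a matter of client configuration.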
@@ -315,7 +320,7 @@ curl -X POST https://<<your-api-url>>/summarize \

_(Make sure to put your actual endpoint URL in place of `<<your-api-url>>`.)_

-You should get a JSON response with a summary, similar to what you saw locally. Congratulations - your AI-powered summarization service is now live on the internet, running serverlessly! 🎉
+You should get a JSON response with a summary, similar to what you saw locally. Congratulations - your AI-powered summarization service is now live on the internet, running completely on serverless infrastructure! 🎉
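For readers who prefer Python over curl, the same smoke test can be sketched with the standard library. The function name is hypothetical, the URL is a placeholder for your deployed endpoint, and the `"text"` body field is an assumption since the curl request body isn't shown in this hunk.

```python
import json
import urllib.request

def make_summarize_request(api_url: str, text: str) -> urllib.request.Request:
    """Build (but don't send) a POST equivalent to the curl example (hypothetical helper)."""
    body = json.dumps({"text": text}).encode("utf-8")
    return urllib.request.Request(
        url=f"{api_url}/summarize",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Placeholder URL; substitute your real endpoint, then send with urllib.request.urlopen(req).
req = make_summarize_request("https://example.com", "Some long text to summarize...")
```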