[Inference Providers] Featherless release blogpost #2883


Merged — 17 commits merged into main from inference-providers-featherless on Jun 12, 2025

Conversation

@SBrandeis (Contributor) commented Jun 3, 2025

[image]

@merveenoyan (Contributor) left a comment:

smol nit and needs banner otherwise lgtm :)

@Wauplin (Contributor) left a comment:

some nits

@SBrandeis requested a review from Wauplin on June 3, 2025 10:59
@Wauplin (Contributor) left a comment:

Left some more comments. Apart from that + banner, looks good to me :)

@Vaibhavs10 (Member) left a comment:

yay!

@Vaibhavs10 (Member) left a comment.
@SBrandeis force-pushed the inference-providers-featherless branch from bdb8e4b to 4a44dec on June 4, 2025 10:03
@hanouticelina (Contributor) left a comment:

very nice! thank you

@SBrandeis changed the title from "blogpost: featherless as a provider" to "blogpost: featherless + groq as a provider" on Jun 11, 2025
@SBrandeis force-pushed the inference-providers-featherless branch from e05be78 to cfda62a on June 11, 2025 10:18

The following example shows how to use DeepSeek-R1 with Featherless AI as the inference provider. You can use a [Hugging Face token](https://huggingface.co/settings/tokens) for automatic routing through Hugging Face, or your own Featherless AI API key if you have one.

Install `huggingface_hub` from source (see [instructions](https://huggingface.co/docs/huggingface_hub/installation#install-from-source)). Official support will be released soon in version v0.33.0.
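For context, a minimal sketch of the kind of snippet the excerpt describes — not the published code; the `provider="featherless-ai"` identifier, the prompt, and the `max_tokens` value are illustrative assumptions:

```python
import os

from huggingface_hub import InferenceClient

# Route requests through Hugging Face with an HF token,
# or pass your own Featherless AI API key instead.
client = InferenceClient(
    provider="featherless-ai",
    api_key=os.environ["HF_TOKEN"],
)

completion = client.chat_completion(
    model="deepseek-ai/DeepSeek-R1",
    messages=[{"role": "user", "content": "What is the capital of France?"}],
    max_tokens=512,
)
print(completion.choices[0].message.content)
```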
A reviewer (Member) commented on the installation note:

to check and update this, since @Wauplin will make a patch today/tomorrow.


A reviewer (Member) commented on the Python snippet:

note to self: to add a snippet for groq too when there's a mapping available

@SBrandeis force-pushed the inference-providers-featherless branch from c538688 to 23697cb on June 11, 2025 12:50
@SBrandeis changed the title from "blogpost: featherless + groq as a provider" to "blogpost: featherless as a provider" on Jun 11, 2025
@SBrandeis marked this pull request as ready for review June 11, 2025 13:09
@SBrandeis changed the title from "blogpost: featherless as a provider" to "[Inference Providers] Featherless release blogpost" on Jun 11, 2025
@burtenshaw (Collaborator) left a comment:

lgtm, just left some nits and docs links

Co-authored-by: burtenshaw <[email protected]>
@Wauplin (Contributor) left a comment:

small changes in snippets (we now load from env var in snippets) + use InferenceClient
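As a sketch of what that change implies (the env var name `HF_TOKEN` and the provider identifier are assumptions, not the final published snippet):

```python
import os

from huggingface_hub import InferenceClient

# Read the token from an environment variable instead of hard-coding it
# in the snippet, and use InferenceClient for the request.
client = InferenceClient(
    provider="featherless-ai",
    api_key=os.environ["HF_TOKEN"],
)
```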

@@ -6148,3 +6148,17 @@
- training
- partnerships
- announcement

- local: inference-providers-featherless
A reviewer (Member) left a suggested change:

Suggested change: `- local: inference-providers-featherless` → `- local: featherless-ai`

(maybe) shorter

Co-authored-by: Pedro Cuenca <[email protected]>
@SBrandeis merged commit bb1017e into main on Jun 12, 2025
1 check passed
@SBrandeis deleted the inference-providers-featherless branch on June 12, 2025 14:15

8 participants