What can you build with Mistral AI and Hugging Face?
Emit new event when a new batch job is completed. See the Documentation
Emit new event when a new batch job fails. See the Documentation
Emit new event when a new AI model is registered or becomes available. See the Documentation
Create a new batch job; it will be queued for processing (a request sketch follows this list). See the Documentation
Want a know-it-all bot that can answer any question? This action lets you ask a question and get an answer from a trained model (see the question answering sketch below). See the docs
Classify images into categories: the model reads an image input and returns the likelihood of each class, using the same request pattern as the question answering sketch below. See the docs
Download a batch job's results file to the /tmp directory (sketch below). See the Documentation
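The batch actions sit on top of Mistral's batch API. As a rough sketch of what creating a job looks like from a Pipedream code step (the /v1/batch/jobs path and request fields follow Mistral's batch documentation at the time of writing, and the file ID and model name below are placeholders you would replace), the request could look like this:

import { axios } from "@pipedream/platform"
export default defineComponent({
  props: {
    mistral_ai: {
      type: "app",
      app: "mistral_ai",
    },
  },
  async run({steps, $}) {
    // Queue a new batch job. The input file must already be uploaded to
    // Mistral's files endpoint; its ID here is a placeholder.
    return await axios($, {
      method: "POST",
      url: `https://api.mistral.ai/v1/batch/jobs`,
      headers: {
        Authorization: `Bearer ${this.mistral_ai.$auth.api_key}`,
        "content-type": `application/json`,
      },
      data: {
        input_files: ["<your-uploaded-file-id>"],
        endpoint: "/v1/chat/completions",
        model: "mistral-small-latest",
      },
    })
  },
})

The response should be a job object with an id and a status, which is what the completed and failed event sources above track.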
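Once a job completes, its results are stored as a file. Below is a minimal sketch for saving that file to /tmp, the writable directory in a Pipedream execution environment (the /v1/files/{id}/content path is an assumption based on Mistral's files API, and the fileId prop is a placeholder for the job's output file ID):

import { axios } from "@pipedream/platform"
import fs from "fs"
export default defineComponent({
  props: {
    mistral_ai: {
      type: "app",
      app: "mistral_ai",
    },
    fileId: {
      type: "string",
      label: "Results File ID",
      description: "The output file ID from a completed batch job",
    },
  },
  async run({steps, $}) {
    // Fetch the raw results for the given file ID
    const contents = await axios($, {
      url: `https://api.mistral.ai/v1/files/${this.fileId}/content`,
      headers: {
        Authorization: `Bearer ${this.mistral_ai.$auth.api_key}`,
      },
    })
    // Write the results to /tmp and return the local path
    const path = `/tmp/${this.fileId}.jsonl`
    fs.writeFileSync(path, typeof contents === "string" ? contents : JSON.stringify(contents))
    return path
  },
})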
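On the Hugging Face side, the question answering and image classification actions wrap model inference calls. Here is a sketch of a question answering request against Hugging Face's serverless Inference API (deepset/roberta-base-squad2 is just an example model; image classification follows the same pattern with the image bytes as the request body):

import { axios } from "@pipedream/platform"
export default defineComponent({
  props: {
    hugging_face: {
      type: "app",
      app: "hugging_face",
    },
  },
  async run({steps, $}) {
    // Ask a question against a context passage using an extractive QA model
    return await axios($, {
      method: "POST",
      url: `https://api-inference.huggingface.co/models/deepset/roberta-base-squad2`,
      headers: {
        Authorization: `Bearer ${this.hugging_face.$auth.access_token}`,
      },
      data: {
        inputs: {
          question: "What is Pipedream used for?",
          context: "Pipedream is a platform for connecting APIs and automating workflows.",
        },
      },
    })
  },
})

The response includes the extracted answer along with a confidence score.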
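Both apps also expose their connected credentials to plain Node.js code steps. To verify a Mistral AI connection, for example, you can list the models available to your API key: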
import { axios } from "@pipedream/platform"
export default defineComponent({
  props: {
    mistral_ai: {
      type: "app",
      app: "mistral_ai",
    }
  },
  async run({steps, $}) {
    // List the models available to your Mistral AI API key
    return await axios($, {
      url: `https://api.mistral.ai/v1/models`,
      headers: {
        Authorization: `Bearer ${this.mistral_ai.$auth.api_key}`,
        "content-type": `application/json`,
      },
    })
  },
})
The Hugging Face API provides access to a vast range of machine learning models, primarily for natural language processing (NLP) tasks like text classification, translation, summarization, and question answering. It lets you leverage pre-trained models and fine-tune them on your data. Using the API within Pipedream, you can automate workflows that involve language processing, integrate AI insights into your apps, or respond to events with AI-generated content.
import { axios } from "@pipedream/platform"
export default defineComponent({
  props: {
    hugging_face: {
      type: "app",
      app: "hugging_face",
    }
  },
  async run({steps, $}) {
    // Simple auth check: return the identity of the connected Hugging Face account
    return await axios($, {
      url: `https://huggingface.co/api/whoami-v2`,
      headers: {
        Authorization: `Bearer ${this.hugging_face.$auth.access_token}`,
      },
    })
  },
})
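The same authorization works for the other Inference API tasks the overview above mentions. As one example, here is a sketch of a summarization request (facebook/bart-large-cnn is one commonly used summarization model; swap in whichever model fits your workflow):

import { axios } from "@pipedream/platform"
export default defineComponent({
  props: {
    hugging_face: {
      type: "app",
      app: "hugging_face",
    },
  },
  async run({steps, $}) {
    // Summarize a passage of text with a hosted summarization model
    return await axios($, {
      method: "POST",
      url: `https://api-inference.huggingface.co/models/facebook/bart-large-cnn`,
      headers: {
        Authorization: `Bearer ${this.hugging_face.$auth.access_token}`,
      },
      data: {
        inputs: "Pipedream lets you connect APIs like Mistral AI and Hugging Face, " +
          "trigger workflows on events such as completed batch jobs, and run Node.js " +
          "code steps that call any HTTP endpoint with your connected credentials.",
      },
    })
  },
})

The response is typically an array whose first element carries a summary_text field.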