
HuggingFace Serverless Model Inference



Loading

The library runs in Node.js and browser environments and is available on npm as @huggingface/inference. To load it in a browser, you can use ES modules via skypack.dev.
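For instance, loading could look like the following sketch; the Skypack URL pattern is an assumption based on how skypack.dev serves npm packages:

```javascript
// In a browser, via ES modules from skypack.dev (URL pattern is an assumption):
import { HfInference } from "https://cdn.skypack.dev/@huggingface/inference";

// In Node.js, after `npm install @huggingface/inference`:
// import { HfInference } from "@huggingface/inference";
```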

Access Token

Using an API key is optional to get started; however, you will eventually be rate limited without one. To get a new token, go to settings/tokens. Please note that this access token is intended to be kept private and is only used here for experimental purposes.
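Creating the client used by the examples below could then look like this sketch; the token value is a placeholder, not a real credential:

```javascript
import { HfInference } from "@huggingface/inference";

// Pass your access token from settings/tokens; without one, requests
// still work on the free tier but are rate limited.
const hf = new HfInference("hf_xxxx"); // placeholder token, keep yours private
```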


Table Question Answering

const tableQResps = await hf.tableQuestionAnswering({
  model: "google/tapas-base-finetuned-wtq",
  inputs: {
    query: tableQuestion,
    table: formatTableData(tableData)
  }
});

view(htl.html`<div class="answer">
  The best performing stock today was: "${tableQResps.answer}"
</div>`);
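The call above relies on a `formatTableData` helper that is not defined here; `tableQuestionAnswering` expects the table as a record mapping each column name to an array of cell strings. A minimal sketch of such a helper, assuming `tableData` is an array of row objects (the row shape and column names are assumptions):

```javascript
// Convert an array of row objects into the column-oriented shape
// expected by tableQuestionAnswering: { colName: [cell, cell, ...] }.
function formatTableData(rows) {
  const table = {};
  for (const row of rows) {
    for (const [col, val] of Object.entries(row)) {
      (table[col] ??= []).push(String(val));
    }
  }
  return table;
}

// Example usage with a hypothetical stock table:
const tableData = [
  { Stock: "AAPL", Change: "+2.3%" },
  { Stock: "MSFT", Change: "-0.4%" },
];
const formatted = formatTableData(tableData);
// formatted.Stock  → ["AAPL", "MSFT"]
// formatted.Change → ["+2.3%", "-0.4%"]
```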

Text to Image

const image = await hf.textToImage({
  inputs: 'award winning high resolution photo of a giant tortoise/((ladybird)) hybrid, [trending on artstation]',
  model: 'stabilityai/stable-diffusion-2',
  parameters: {
    negative_prompt: 'blurry',
  }
});

// Convert blob to URL and display
const imageUrl = URL.createObjectURL(image);
const imgElement = document.createElement('img');
imgElement.src = imageUrl;
imgElement.style.maxWidth = '100%';
display(imgElement);

// Clean up the object URL when done
imgElement.onload = () => URL.revokeObjectURL(imageUrl);