
Leveraging Hugging Face for complex generative AI use cases

Published July 1, 2023



Jeff Boudier
Waseem AlShikh

In this conversation, Jeff Boudier asks Waseem AlShikh, Co-founder and CTO of Writer, about their journey from a Hugging Face user, to a customer, and now an open source model contributor.

  • why was Writer started?
  • what are the biggest misconceptions in Generative AI today?
  • why is Writer now contributing open source models?
  • what has been the value of the Hugging Face Expert Acceleration Program service for Writer?
  • how is Writer approaching production on CPU and GPU to serve LLMs at scale?
  • how important is efficiency and using CPUs for production?

If you’re interested in the Hugging Face Expert Acceleration Program for your company, please contact us here - our team will contact you to discuss your requirements!




Source

Hugging Face Blog - huggingface.co

View the original publication