# Building LLM Powered Applications


[Building LLM Powered Applications (O'Reilly)](https://learning.oreilly.com/library/view/building-llm-powered/9781835462317/)

GitHub

- [Building LLM Powered Applications](https://github.com/PacktPublishing/Building-LLM-Powered-Applications)
- [hamzafarooq/building-llm-applications-from-scratch](https://github.com/hamzafarooq/building-llm-applications-from-scratch)

## Table of contents

- Preface
  - Who this book is for
  - What this book covers
  - To get the most out of this book
  - Get in touch
- Introduction to Large Language Models
  - What are large foundation models and LLMs?
  - AI paradigm shift – an introduction to foundation models
  - Under the hood of an LLM
  - Most popular LLM transformers-based architectures
  - Early experiments
  - Introducing the transformer architecture
  - Training and evaluating LLMs
  - Training an LLM
  - Model evaluation
  - Base models versus customized models
  - How to customize your model
  - Summary
  - References
- LLMs for AI-Powered Applications
  - How LLMs are changing software development
  - The copilot system
  - Introducing AI orchestrators to embed LLMs into applications
  - The main components of AI orchestrators
  - LangChain
  - Haystack
  - Semantic Kernel
  - How to choose a framework
  - Summary
  - References
- Choosing an LLM for Your Application
  - The most promising LLMs in the market
  - Proprietary models
  - GPT-4
  - Gemini 1.5
  - Claude 2
  - Open-source models
  - LLaMA-2
  - Falcon LLM
  - Mistral
  - Beyond language models
  - A decision framework to pick the right LLM
  - Considerations
  - Case study
  - Summary
  - References
- Prompt Engineering
  - Technical requirements
  - What is prompt engineering?
  - Principles of prompt engineering
  - Clear instructions
  - Split complex tasks into subtasks
  - Ask for justification
  - Generate many outputs, then use the model to pick the best one
  - Repeat instructions at the end
  - Use delimiters
  - Advanced techniques
  - Few-shot approach
  - Chain of thought
  - ReAct
  - Summary
  - References
- Embedding LLMs within Your Applications
  - Technical requirements
  - A brief note about LangChain
  - Getting started with LangChain
  - Models and prompts
  - Data connections
  - Memory
  - Chains
  - Agents
  - Working with LLMs via the Hugging Face Hub
  - Create a Hugging Face user access token
  - Storing your secrets in an .env file
  - Start using open-source LLMs
  - Summary
  - References
- Building Conversational Applications
  - Technical requirements
  - Getting started with conversational applications
  - Creating a plain vanilla bot
  - Adding memory
  - Adding non-parametric knowledge
  - Adding external tools
  - Developing the front-end with Streamlit
  - Summary
  - References
- Search and Recommendation Engines with LLMs
  - Technical requirements
  - Introduction to recommendation systems
  - Existing recommendation systems
  - K-nearest neighbors
  - Matrix factorization
  - Neural networks
  - How LLMs are changing recommendation systems
  - Implementing an LLM-powered recommendation system
  - Data preprocessing
  - Building a QA recommendation chatbot in a cold-start scenario
  - Building a content-based system
  - Developing the front-end with Streamlit
  - Summary
  - References
- Using LLMs with Structured Data
  - Technical requirements
  - What is structured data?
  - Getting started with relational databases
  - Introduction to relational databases
  - Overview of the Chinook database
  - How to work with relational databases in Python
  - Implementing the DBCopilot with LangChain
  - LangChain agents and SQL Agent
  - Prompt engineering
  - Adding further tools
  - Developing the front-end with Streamlit
  - Summary
  - References
- Working with Code
  - Technical requirements
  - Choosing the right LLM for code
  - Code understanding and generation
  - Falcon LLM
  - CodeLlama
  - StarCoder
  - Act as an algorithm
  - Leveraging Code Interpreter
  - Summary
  - References
- Building Multimodal Applications with LLMs
  - Technical requirements
  - Why multimodality?
  - Building a multimodal agent with LangChain
  - Option 1: Using an out-of-the-box toolkit for Azure AI Services
  - Getting started with AzureCognitiveServicesToolkit
  - Setting up the toolkit
  - Leveraging a single tool
  - Leveraging multiple tools
  - Building an end-to-end application for invoice analysis
  - Option 2: Combining single tools into one agent
  - YouTube tools and Whisper
  - DALL·E and text generation
  - Putting it all together
  - Option 3: Hard-coded approach with a sequential chain
  - Comparing the three options
  - Developing the front-end with Streamlit
  - Summary
  - References
- Fine-Tuning Large Language Models
  - Technical requirements
  - What is fine-tuning?
  - When is fine-tuning necessary?
  - Getting started with fine-tuning
  - Obtaining the dataset
  - Tokenizing the data
  - Fine-tuning the model
  - Using evaluation metrics
  - Training and saving
  - Summary
  - References
- Responsible AI
  - What is Responsible AI and why do we need it?
  - Responsible AI architecture
  - Model level
  - Metaprompt level
  - User interface level
  - Regulations surrounding Responsible AI
  - Summary
  - References
- Emerging Trends and Innovations
  - The latest trends in language models and generative AI
  - GPT-4V(ision)
  - DALL-E 3
  - AutoGen
  - Small language models
  - Companies embracing generative AI
  - Coca-Cola
  - Notion
  - Malbek
  - Microsoft
  - Summary
  - References
- Other Books You May Enjoy
- Index
