
Prompt Engineering & AI Application Deployment


This work is licensed under a Creative Commons Attribution 4.0 International License.

Welcome to our documentation on generative Artificial Intelligence (AI) prompt engineering and application integration in academic research and education.

Learning Objectives

After the lesson, you should be able to:

  • Explain why generative AI matters in education, research, and society
  • Create effective prompts in ChatGPT and other Transformer-based applications
  • Create your own AI-powered applications using Gradio
  • Understand how and when to use AI assistants

General Productivity Applications

The most likely interaction you will have (or have already had) with generative AI, Transformers, and Large Language Models (LLMs) is through OpenAI's ChatGPT.

Predictive text and autocompletion are becoming more common in productivity software. Generative AI-powered features are making their way into everyday tools such as word processors, SMS text messaging, and spreadsheets, and LLMs are now being built into full productivity suites, including Microsoft 365 with Copilot and Google Workspace (Docs and Sheets).

ChatGPT

Go to our lesson on ChatGPT Prompt Engineering

AI enabled search engines

ChatGPT is integrated into Microsoft's Edge browser via Bing Chat. Google's competing model, LaMDA, powers Bard.

Go to our lesson on Edge Bing

Go to our lesson on Bard

Research Applications

Research applications of generative AI and LLMs are broad. We obviously won't be able to teach all of them here, but hopefully this is an effective jumping off point:

Programming

Go to our lesson on GitHub CoPilot

Go to our lesson on the OpenAI API
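As a quick preview of what the OpenAI API lesson covers, here is a minimal, illustrative sketch of a chat completion request from Python. It assumes the `openai` Python package (0.x interface) is installed and that your key is stored in the `OPENAI_API_KEY` environment variable; the model and prompts are placeholders.

```python
# Minimal sketch of an OpenAI chat completion call (openai 0.x interface).
# Assumes OPENAI_API_KEY is set in the environment; model and prompts are examples.
import os
import openai

openai.api_key = os.environ["OPENAI_API_KEY"]

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": "You are a helpful research assistant."},
        {"role": "user", "content": "Explain what a transformer model is in two sentences."},
    ],
)

print(response["choices"][0]["message"]["content"])
```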

Applications

Go to our lesson on OpenAI API Powered Extensions

Go to our lesson on 🤗 HuggingFace Models

Go to our lesson on 🤗 HuggingFace Datasets

Go to our lesson on 🤗 Gradio UI
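To give a sense of how these pieces fit together, here is a minimal, illustrative sketch of the kind of app those lessons build: a pretrained 🤗 HuggingFace model wrapped in a Gradio web interface. It assumes the `transformers` and `gradio` packages are installed; the pipeline task shown is just an example.

```python
# Minimal sketch: wrap a 🤗 Transformers pipeline in a Gradio web UI.
# Assumes `pip install transformers gradio`; the model is downloaded from the Hub.
import gradio as gr
from transformers import pipeline

classifier = pipeline("sentiment-analysis")  # small default sentiment model

def classify(text: str) -> str:
    result = classifier(text)[0]
    return f"{result['label']} ({result['score']:.2f})"

# Launches a local web interface with a text input and a text output.
demo = gr.Interface(fn=classify, inputs="text", outputs="text")
demo.launch()
```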

Educational Applications

Thinking about integrating ChatGPT and OpenAI into your coursework?

There is already a long list of potential uses for ChatGPT in higher education.

Read about OpenAI Educator Considerations

Glossary

Google's Machine Learning Glossary

NVIDIA's Data Science Glossary

Bard - Google's general-purpose AI chat service, powered by LaMDA

Bidirectional Encoder Representations from Transformers (BERT) - a family of masked-language models introduced in 2018 by researchers at Google (Devlin et al.)
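For illustration (not part of the lessons above), a masked-language model like BERT can be queried with the 🤗 Transformers pipeline API; it predicts the most likely tokens for a [MASK] position. The model choice and sentence below are placeholders.

```python
# Illustrative sketch of masked-language modelling with a BERT checkpoint.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

for prediction in fill_mask("The researcher analyzed the [MASK] before publishing."):
    print(prediction["token_str"], round(prediction["score"], 3))
```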

ChatGPT - OpenAI's general-purpose LLM chatbot

CoPilot - GitHub's (Microsoft/OpenAI) AI pair programmer, natively integrated as an extension in VS Code and GitHub Codespaces

Generative Pretrained Transformer (GPT) - a family of large language models introduced in 2018 by the American artificial intelligence organization OpenAI (Radford et al.)

GitHub - the most widely used version control platform, owned by Microsoft and natively integrated with OpenAI-powered tools such as CoPilot

DALL·E - OpenAI's text-to-image generation model

HuggingFace - a platform and open-source library ecosystem for sharing AI models, datasets, and apps

Large Language Models (LLMs) - language models consisting of a neural network with many parameters (typically billions of weights or more), trained on large quantities of unlabelled text using self-supervised learning

Language Model for Dialogue Applications (LaMDA) - Google's general-purpose conversational LLM

Latent Diffusion Model (LDM) - a machine learning model designed to learn the underlying structure of a dataset by mapping it to a lower-dimensional latent space

Large Language Model Meta AI (LLaMA) - Meta's general-purpose LLM

MidJourney - a popular, proprietary image generation platform, accessed via Discord

Neural networks - similar to their biological counterparts in the sense that they are built from interconnected nodes. Rather than the string-like neurons and synapses of biology, artificial networks are made of nodes connected by 'weights', which can have positive or negative values.
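As a toy illustration of that idea, a single artificial node multiplies each input by a weight (positive or negative), sums the results, and passes the sum through an activation function. The numbers below are arbitrary.

```python
# Toy example of one artificial node: weighted sum plus a ReLU activation.
import numpy as np

inputs = np.array([0.5, -1.2, 3.0])   # values arriving from other nodes
weights = np.array([0.8, -0.4, 0.1])  # learned weights, positive or negative
bias = 0.2

activation = max(0.0, float(np.dot(inputs, weights) + bias))
print(activation)  # ≈ 1.38
```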

OpenAI - the private company behind the GPT family of LLMs and ChatGPT

Parameter - a value that the model can independently modify as it is trained. Parameters are derived from the training data on which the model is trained. The number of parameters in the newest LLMs is typically counted in the billions to trillions.

Segment-Anything (Meta) - a recently released image segmentation model that allows you to 'clip' a feature out of an image with a single click

Stable Diffusion - an open-source family of latent diffusion models for generating images from text
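For illustration, an open Stable Diffusion checkpoint can be run locally through the 🤗 diffusers library; the model ID and prompt below are placeholders, and a GPU is assumed.

```python
# Illustrative sketch of text-to-image generation with 🤗 diffusers.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
)
pipe = pipe.to("cuda")  # a GPU is effectively required for reasonable speed

image = pipe("a watercolor painting of a desert botanical garden").images[0]
image.save("garden.png")
```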

Tuning - the process of further training (refining) a pretrained model so it becomes more accurate for a specific task

Weights - the values by which a model multiplies its inputs. A weight reflects the learned importance of a connection; the weights represent the values of a model's parameters after self-supervised training.

Zero-shot - learning in which the model is shown samples from classes it did not observe during training and must predict which class they belong to
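For illustration, 🤗 Transformers provides a zero-shot classification pipeline in which the model scores candidate labels it was never explicitly trained on; the model and labels below are placeholders.

```python
# Illustrative sketch of zero-shot classification with 🤗 Transformers.
from transformers import pipeline

classifier = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")

result = classifier(
    "The telescope captured new images of a distant galaxy cluster.",
    candidate_labels=["astronomy", "biology", "economics"],
)
print(result["labels"][0], round(result["scores"][0], 3))
```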


Last update: 2023-05-27