Gerard Sans

Developer Evangelist

International Speaker

Spoken at 168 events in 37 countries

Serverless Training

What if an AI could do your job?

How long do you think it will take?

AI is likely to overtake humans in the next five years

– Elon Musk

GitHub Copilot

Technical Preview

JavaScript

TypeScript

Go

Ruby

Python

Just enough to start...

Provide some code...

Use comments

Or cherry-pick alternatives

GitHub Copilot Architecture

OpenAI

GitHub

Visual Studio Code

Copilot Service

Codex

OpenAI

Greg Brockman

Elon Musk

Sam Altman

Co-Founders

Our mission is to build safe artificial general intelligence (AGI) that benefits all humanity.

2015

Introducing OpenAI

Initially a non-profit; later restructured as a "capped-profit" company and raised $1B.

2018

GPT is released

First Generative Pre-trained Transformer model with 117M parameters.

2019

GPT-2 is released

New model with 1.5B parameters.

2020

GPT-3 and OpenAI API are released

Largest model to date with 175B parameters.

2021

GitHub Copilot and Codex are released

Largest code model to date with 12B parameters.

NLP Models comparison

[Chart: NLP model size by year, 2018-2021 (parameters in billions, log scale 1 to 1,000): GPT-1, BERT, GPT-2 (1.5B), RoBERTa, T5 (11B), GPT-3 (175B), Switch (1.6T), Gopher (280B), Codex (12B)]

GPT-3

GPT-3 in action

The day started as any other day with a couple of hours of intense work before lunch.

The day started as any other day

Prompt

Hyperparameters

Response

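A minimal sketch of what sits behind this demo, assuming access to the OpenAI Completions API: the prompt plus hyperparameters such as temperature and max_tokens are sent to GPT-3, and the continuation comes back as the response. The endpoint path reflects the 2021-era API, the engine name is a placeholder, and a runtime with fetch (for example Node 18+) is assumed.

// Sketch: send a prompt plus hyperparameters to GPT-3 and read back the response.
// Assumes an API key in OPENAI_API_KEY; engine name and endpoint follow the 2021 API.
async function complete(prompt) {
  const res = await fetch('https://api.openai.com/v1/engines/davinci/completions', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'Authorization': `Bearer ${process.env.OPENAI_API_KEY}`,
    },
    body: JSON.stringify({
      prompt,            // "The day started as any other day"
      max_tokens: 32,    // hyperparameter: how much text to generate
      temperature: 0.7,  // hyperparameter: how much randomness to allow
    }),
  });
  const data = await res.json();
  return data.choices[0].text; // " with a couple of hours of intense work before lunch."
}

// complete('The day started as any other day').then(console.log);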

Conversational AI

Question: Who discovered America?

Answer: Christopher Columbus.

Question: What year was that?

Answer: 3054.

That doesn't look right...
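A minimal sketch of how the conversation above is built: GPT-3 keeps no memory between calls, so the client appends every question and answer to a growing prompt and asks the model to continue it. The askModel parameter is a placeholder for any call to GPT-3, such as the complete function sketched earlier.

// Sketch: stateless conversation by prompt accumulation.
// `askModel` is a placeholder for a call to GPT-3 (see the earlier `complete` sketch).
function createConversation(askModel) {
  let transcript = '';
  return async function ask(question) {
    transcript += `Question: ${question}\nAnswer:`;
    const answer = await askModel(transcript); // the model continues the transcript
    transcript += `${answer}\n`;               // remember the turn for the next question
    return answer.trim();
  };
}

// const ask = createConversation(complete);
// await ask('Who discovered America?'); // "Christopher Columbus."
// await ask('What year was that?');     // may still be wrong, e.g. "3054."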

AI generated text may be...

Biased

Non-factual

Inaccurate

1 + 1 = 3

GPT-3 Model Stages

Design

Training

Predictions

GPT-3 Model

  • Released: June 2020
  • Common Crawl, WebText2 (Reddit), Books1, Books2 and Wikipedia
  • 175B parameters, 96 layers, 12,288 dimensions
  • Total 550GB

[Diagram: GPT-3 model at a glance: released June 2020; 175B parameters, 96 layers, 12,288 dimensions, 550GB in total; training dataset: CommonCrawl, WebText2, Books1 + Books2, Wikipedia]

GPT-3 Training Dataset

499 billion tokens

100+ languages

Russian

Romanian

Polish

Danish

Japanese

Swedish

Finnish

French

German

Italian 

Portuguese

Dutch

English

Spanish

Guten Morgen!
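A rough sketch of what those 499 billion tokens are: GPT-3 splits text into sub-word tokens using byte-pair encoding, and a common rule of thumb for English is about four characters per token. The estimator below uses only that heuristic; the real tokenizer is more involved and yields different counts for other languages.

// Sketch: rough token-count estimate using the ~4 characters/token rule of thumb.
// The real GPT-3 tokenizer is a byte-pair encoder; this is only an approximation.
function estimateTokens(text) {
  return Math.ceil(text.length / 4);
}

console.log(estimateTokens('Christopher Columbus discovered America in 1492.')); // ≈ 12
console.log(estimateTokens('Guten Morgen!'));                                    // ≈ 4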

Training phase

Adjust the model so its predictions match the expected output

Wikipedia

Christopher  is

Christopher Columbus was

Input

Output

GPT-3

Christopher Columbus discovered  America

Christopher Columbus discovered America  in

Christopher Columbus discovered America in  1492

Christopher Columbus discovered America in 1492  .

Christopher Columbus discovered America in 1492.
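What the slide animates, as a minimal sketch: the model only ever predicts the next token, and longer text is produced by feeding each prediction back in as new input. The nextToken function below is a hypothetical stand-in for GPT-3, hard-coded to replay the example above.

// Sketch: autoregressive generation: predict one token, append it, repeat.
// `nextToken` is a toy stand-in for the model, hard-coded to the slide's example.
const continuations = {
  'Christopher Columbus discovered': ' America',
  'Christopher Columbus discovered America': ' in',
  'Christopher Columbus discovered America in': ' 1492',
  'Christopher Columbus discovered America in 1492': '.',
};

function nextToken(text) {
  return continuations[text] ?? null;
}

function generate(prompt) {
  let text = prompt;
  let token;
  while ((token = nextToken(text)) !== null) { // stop when the model has nothing to add
    text += token;
  }
  return text;
}

console.log(generate('Christopher Columbus discovered'));
// "Christopher Columbus discovered America in 1492."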

Hyper-dimensional Graph

[Diagram: words such as "who", "is", "America", "Columbus" and "discover" plotted as points in a 12,288-dimensional space with axes d_1, d_2, d_3, ..., d_12288]
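A minimal sketch of the idea behind this diagram: inside the model every token is represented as a vector of 12,288 numbers, and words used in similar contexts end up pointing in similar directions. The tiny 3-dimensional vectors below are invented for illustration; only the cosine-similarity arithmetic is real.

// Sketch: words as vectors in a high-dimensional space (here only 3 dimensions,
// invented for illustration; GPT-3 uses 12,288).
const embeddings = {
  Columbus: [0.9, 0.1, 0.3],
  discover: [0.8, 0.2, 0.4],
  America:  [0.7, 0.3, 0.2],
  is:       [0.1, 0.9, 0.8],
};

// Cosine similarity: close to 1 means "pointing the same way", near 0 means unrelated.
function cosine(a, b) {
  const dot  = a.reduce((sum, x, i) => sum + x * b[i], 0);
  const norm = v => Math.sqrt(v.reduce((sum, x) => sum + x * x, 0));
  return dot / (norm(a) * norm(b));
}

console.log(cosine(embeddings.Columbus, embeddings.discover).toFixed(2)); // high
console.log(cosine(embeddings.Columbus, embeddings.is).toFixed(2));       // lower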

Predictions

Model prediction

Prompt template

Consider just this word

Consider the whole phrase

Consider the whole prompt

Who

Who discovered America?

Question: Who discovered America?
Answer:

Who is

Who discovered America? I wonder

Question: Who discovered America?
Answer: Christopher Columbus

Input

Output

GPT-3
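A small sketch of the prompt-template idea on this slide: the same question yields very different completions depending on how much context the prompt carries, so a practical prompt wraps the user's question in a template that tells the model what kind of answer is expected. The qaPrompt helper below is illustrative; the call to the model itself is omitted.

// Sketch: wrap a raw question in a Q&A template so the model answers it instead of
// rambling ("Who" alone may continue as "Who is"; the bare question may continue as
// "Who discovered America? I wonder...").
function qaPrompt(question) {
  return `Question: ${question}\nAnswer:`;
}

console.log(qaPrompt('Who discovered America?'));
// Question: Who discovered America?
// Answer:
// Sent to GPT-3, the expected completion is " Christopher Columbus"; "Question:" can
// also be passed as a stop sequence so generation ends after the answer.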

Coding Tools

// Reverse a string in one line
function reverseString(str) {
  return str.split('').reverse().join('');
}

How did this happen?!

GPT-3 can be used for...

Writing Tools

Automated Summaries

Translations

Coding Tools

Conversational AI

Instructions

Codex

Codex Model

  • Released: August 2021
  • GitHub snapshot (May 2020)
  • 54M public repositories
  • 12B parameters, 96 layers, 12,288 dimensions
  • Total 179GB, 159GB sanitised
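A hedged sketch of calling Codex directly, assuming access to the 2021 technical preview of the OpenAI API: the same Completions endpoint is used, but with a Codex engine and a code comment as the prompt. The engine name (davinci-codex) and request shape follow the documentation at the time and may differ in current API versions.

// Sketch: ask Codex to turn a comment into code via the OpenAI Completions API.
// Assumes a runtime with fetch and an API key in OPENAI_API_KEY; the engine name
// reflects the 2021 technical preview and may have changed since.
async function commentToCode(comment) {
  const res = await fetch('https://api.openai.com/v1/engines/davinci-codex/completions', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'Authorization': `Bearer ${process.env.OPENAI_API_KEY}`,
    },
    body: JSON.stringify({
      prompt: `// ${comment}\n`, // e.g. "// Reverse a string in one line"
      max_tokens: 64,
      temperature: 0,            // low temperature: pick the most likely code
    }),
  });
  const data = await res.json();
  return data.choices[0].text;
}

// commentToCode('Reverse a string in one line').then(console.log);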

Codex Language Support

Perl

Go

Ruby

Python

JavaScript

TypeScript

SQL

PHP

Shell

Rust

Swift

C#

This is only the beginning

The future of AI is now

Try it!

Request Access

Copilot Extension

Technical Preview

OpenAI Codex

OpenAI Playground

GPT-J-6B

Copy.ai

How GitHub Copilot learned JavaScript

By Gerard Sans

I mean… where do I even start? As a JavaScript developer, the struggle is real: JavaScript fatigue, new releases, new APIs, and the list keeps growing. How is it possible that an AI beat us all to it and has the audacity to come up with code recommendations?! No need to search Stack Overflow: you write a comment with a description and a code snippet is proposed to you. In this talk, I will look into the milestones that allowed OpenAI, a San Francisco company co-founded by Elon Musk, to achieve such incredible results. We will explore its capabilities and some of the more technical details, like GPT models, transformers and Codex, along with the current progress and what we can expect in the future!
