Rust’s Journey to Async/Await (Steve Klabnik)
SLIDE 1

Rust’s Journey to Async/Await

Steve Klabnik

SLIDE 2

Hi, I’m Steve!

  • On the Rust team
  • Work at Cloudflare
  • Doing two workshops!

SLIDE 3

SLIDE 4

What is async?

Parallel: do multiple things at once

Concurrent: do multiple things, not at once

Asynchronous: actually unrelated! Sort of...

SLIDE 5

“Task”

A generic term for some computation running in a parallel or concurrent system

SLIDE 6

Parallel

Only possible with multiple cores or CPUs

SLIDE 7

Concurrent

Pretend that you have multiple cores or CPUs

SLIDE 8

Asynchronous

A word we use to describe language features that enable parallelism and/or concurrency

SLIDE 9

Even more terminology

SLIDE 10

Cooperative vs Preemptive Multitasking

SLIDE 11

Cooperative Multitasking

Each task decides when to yield to other tasks

SLIDE 12

Preemptive Multitasking

The system decides when to yield to other tasks

SLIDE 13

Native vs green threads

SLIDE 14

Native threads

Tasks provided by the operating system

Sometimes called “1:1 threading”

SLIDE 15

Green Threads

Tasks provided by your programming language Sometimes called “N:M threading”

SLIDE 16

Native vs Green threads

Native thread advantages:

  • Part of your system; OS handles scheduling
  • Very straightforward, well-understood

Native thread disadvantages:

  • Defaults can be sort of heavy
  • Relatively limited number you can create

Green thread advantages:

  • Not part of the overall system; runtime handles scheduling
  • Lighter weight; can create many, many, many, many green threads

Green thread disadvantages:

  • Stack growth can cause issues
  • Overhead when calling into C
SLIDE 17

Why do we care?

SLIDE 18

SLIDE 19

Apache

“Pre-fork”

(diagram: a control process forking multiple child processes, one per connection)

SLIDE 20

Apache

“worker”

(diagram: a control process with child processes, each running a thread pool of child threads)

SLIDE 21

Let’s talk about Rust

SLIDE 22

Rust was built to enhance Firefox, which is an HTTP client, not a server

SLIDE 23

SLIDE 24

SLIDE 25

“Synchronous, non-blocking network I/O”

SLIDE 26

Isn’t this a contradiction in terms?

SLIDE 27

               Synchronous                   Asynchronous
Blocking       Old-school implementations    Doesn’t make sense
Non-blocking   Go, Ruby                      Node.js

SLIDE 28

Tons of options

Synchronous, blocking

  • Your code looks like it blocks, and it does block
  • Very basic and straightforward

Asynchronous, non-blocking

  • Your code looks like it doesn’t block, and it doesn’t block
  • Harder to write

Synchronous, non-blocking

  • Your code looks like it blocks, but it doesn’t!
  • The secret: the runtime is non-blocking
  • Your code still looks straightforward, but you get performance benefits
  • A common path for languages built on synchronous, blocking I/O to gain performance while retaining compatibility

SLIDE 29

Not all was well in Rust-land

SLIDE 30

SLIDE 31

A “systems programming language” that doesn’t let you use the system’s threads?

SLIDE 32

SLIDE 33

SLIDE 34

SLIDE 35

Not all was well in Rust-land

SLIDE 36

SLIDE 37

Rust 1.0 was approaching

SLIDE 38

Ship the minimal thing that we know is good

SLIDE 39

Rust 1.0 was released! 🎊

SLIDE 40

… but still, not all was well in Rust-land

SLIDE 41

People 💗 Rust

SLIDE 42

People want to build network services in Rust

SLIDE 43

Rust is supposed to be a high-performance language

SLIDE 44

Rust’s I/O model feels retro, and not performant

SLIDE 45

The big problem with native threads for I/O

SLIDE 46

CPU bound vs I/O bound

SLIDE 47

CPU Bound

The speed of completing a task is based on the CPU crunching some numbers

My processor is working hard

SLIDE 48

I/O Bound

The speed of completing a task is based on doing a lot of input and output

Doing a lot of networking

SLIDE 49

When you’re doing a lot of I/O, you’re doing a lot of waiting
SLIDE 50

When you’re doing a lot of waiting, you’re tying up system resources

SLIDE 51

Go

Asynchronous I/O with green threads (Erlang does this too)

(diagram: a main process whose native child threads each schedule many green threads)

SLIDE 52

Native vs Green threads

Native thread advantages:

  • Part of your system; OS handles scheduling
  • Very straightforward, well-understood

Native thread disadvantages:

  • Defaults can be sort of heavy
  • Relatively limited number you can create

Green thread advantages:

  • Not part of the overall system; runtime handles scheduling
  • Lighter weight; can create many, many, many, many green threads

Green thread disadvantages:

  • Stack growth can cause issues
  • Overhead when calling into C

PREVIOUSLY

SLIDE 53

A “systems programming language” that has overhead when calling into C code?

SLIDE 54

Luckily, there is another way

SLIDE 55

Nginx

Asynchronous I/O

Event Loop

SLIDE 56

SLIDE 57

Evented I/O requires non-blocking APIs

SLIDE 58

Blocking vs non-blocking

SLIDE 59

“Callback hell”

SLIDE 60

SLIDE 61

SLIDE 62

Promises

let myFirstPromise = new Promise((resolve, reject) => {
  setTimeout(function() {
    resolve("Success!");
  }, 250);
});

myFirstPromise.then((successMessage) => {
  console.log("Yay! " + successMessage);
});

SLIDE 63

Promises

let myFirstPromise = new Promise((resolve, reject) => {
  setTimeout(function() {
    resolve("Success!");
  }, 250);
});

myFirstPromise.then((successMessage) => {
  console.log("Yay! " + successMessage);
}).then((...) => {
  //
}).then((...) => {
  //
});

SLIDE 64

SLIDE 65

SLIDE 66

Futures 0.1

pub trait Future {
    type Item;
    type Error;

    fn poll(&mut self) -> Poll<Self::Item, Self::Error>;
}

id_rpc(&my_server).and_then(|id| {
    get_row(id)
}).map(|row| {
    json::encode(row)
}).and_then(|encoded| {
    write_string(my_socket, encoded)
})

SLIDE 67

Promises and Futures are different!

Promises:

  • Promises are built into JavaScript
  • The language has a runtime
  • This means that Promises start executing upon creation
  • This feels simpler, but has some drawbacks, namely, lots of allocations

Futures:

  • Futures are not built into Rust
  • The language has no runtime
  • This means that you must submit your futures to an executor to start execution
  • Futures are inert until their poll method is called by the executor
  • This is slightly more complex, but extremely efficient; a single, perfectly sized allocation per task!
  • Compiles into the state machine you’d write by hand with evented I/O

SLIDE 68

Futures 0.1: Executors

use tokio::net::TcpListener;
use tokio::prelude::*;

fn main() {
    let addr = "127.0.0.1:6142".parse().unwrap();
    let listener = TcpListener::bind(&addr).unwrap();

    let server = listener.incoming()
        .for_each(|socket| {
            Ok(())
        })
        .map_err(|err| {
            println!("accept error = {:?}", err);
        });

    println!("server running on localhost:6142");
    tokio::run(server);
}

SLIDE 69

We used Futures 0.1 to build stuff!

SLIDE 70

The design had some problems

SLIDE 71

Futures 0.2

trait Future {
    type Item;
    type Error;

    fn poll(&mut self, cx: task::Context) -> Poll<Self::Item, Self::Error>;
}

No implicit context, no more need for thread-local storage.

SLIDE 72

SLIDE 73

SLIDE 74

// with callback
request('https://google.com/', (response) => {
  // handle response
})

// with promise
request('https://google.com/').then((response) => {
  // handle response
});

// with async/await
async function handler() {
  let response = await request('https://google.com/')
  // handle response
}

Async/await

SLIDE 75

Async/await lets you write code that feels synchronous, but is actually asynchronous

SLIDE 76

Async/await is more important in Rust than in other languages because Rust has no garbage collector

SLIDE 77

Rust example: synchronous

fn read(&mut self, buf: &mut [u8]) -> Result<usize, io::Error>

let mut buf = [0; 1024];
let mut cursor = 0;

while cursor < 1024 {
    cursor += socket.read(&mut buf[cursor..])?;
}

SLIDE 78

Rust example: async with Futures

fn read<T: AsMut<[u8]>>(self, buf: T)
    -> impl Future<Item = (Self, T, usize),
                   Error = (Self, T, io::Error)>

… the code is too big to fit on the slide

The main problem: the borrow checker doesn’t understand asynchronous code. The constraints on the code when it’s created and when it executes are different.

SLIDE 79

Rust example: async with async/await

async {
    let mut buf = [0; 1024];
    let mut cursor = 0;

    while cursor < 1024 {
        cursor += socket.read(&mut buf[cursor..]).await?;
    };

    buf
}

async/await can teach the borrow checker about these constraints.

SLIDE 80

Not all futures can error

trait Future {
    type Item;
    type Error;

    fn poll(&mut self, cx: task::Context) -> Poll<Self::Item, Self::Error>;
}

SLIDE 81

std::future

pub trait Future {
    type Output;

    fn poll(self: Pin<&mut Self>, cx: &mut Context) -> Poll<Self::Output>;
}

Pin is how async/await teaches the borrow checker. If you need a future that errors, set Output to a Result<T, E>.

SLIDE 82

… but one more thing...

SLIDE 83

What syntax for async/await?

async is not an issue

JavaScript and C# do: await value;

But what about ? for error handling?

await value?;
await (value?);
(await value)?;

SLIDE 84

What syntax for async/await?

What about chains of await?

(await (await value)?);

SLIDE 85

We argued and argued and argued and argued and argued and ar...

SLIDE 86

SLIDE 87

What syntax for async/await?

async {
    let mut buf = [0; 1024];
    let mut cursor = 0;

    while cursor < 1024 {
        cursor += socket.read(&mut buf[cursor..]).await?;
    };

    buf
}

// no errors
future.await

// with errors
future.await?

SLIDE 88

… there’s actually even one last issue that’s popped up

SLIDE 89

… this talk is already long enough

SLIDE 90

Additional ergonomic improvements

use runtime::net::UdpSocket;

#[runtime::main]
async fn main() -> std::io::Result<()> {
    let mut socket = UdpSocket::bind("127.0.0.1:8080")?;
    let mut buf = vec![0u8; 1024];

    println!("Listening on {}", socket.local_addr()?);

    loop {
        let (recv, peer) = socket.recv_from(&mut buf).await?;
        let sent = socket.send_to(&buf[..recv], &peer).await?;
        println!("Sent {} out of {} bytes to {}", sent, recv, peer);
    }
}

SLIDE 91

WebAssembly?

SLIDE 92

WebAssembly?

(diagram: converting between a JavaScript Promise and a Rust Future, in both directions)

SLIDE 93

Finally landing in Rust 1.37 Or maybe 1.38

SLIDE 94

Finally landing in Rust 1.37 Or maybe 1.38

SLIDE 95

SLIDE 96

Finally landing in Rust 1.38!!!!1

SLIDE 97

SLIDE 98

Lesson: a world-class I/O system implementation takes years

SLIDE 99

Lesson: different languages have different constraints

SLIDE 100

Thank you! @steveklabnik