Track: 21st Century Languages

Location: Pacific DEKJ

Compile to Native, Microservices, Machine learning... tailor-made languages solving modern challenges, featuring use cases around Go, Rust, C#, and Elm.

Track Host: Kavya Joshi

Software Engineer @Samsara

Kavya Joshi writes code for a living at a start-up in San Francisco. She particularly enjoys architecting and building highly concurrent, highly scalable systems. In her free time, she reads non-fiction and climbs rocks. Before moving to San Francisco to be an Adult, Kavya was at MIT where she got a Bachelor's and Master's in Computer Science.

Writing High Performance Go

Go programs are often deployed in environments where low latency and high throughput are a must. In this talk, we'll study three aspects of writing high-performance Go applications:

  • How to write effective benchmarks (and interpret their results, including some traps for young players and advice on how to avoid them).
  • How to use the tools built into the Go runtime to gain an understanding of how your application is performing.
  • Understanding the Go garbage collector and writing GC-friendly code.

Dave Cheney, Software Engineer & Go Core Contributor

The Why of Go

Come learn about Go through a historical lens. What limits had we hit by the end of the 20th century that Go was designed to address? With the historical context behind the language's technical decisions, you can better understand its concurrency primitives, garbage collection, and small standard library. Learn about the challenges language architects face in the pursuit of simplicity.

Carmen Andoh, Software Engineer on the Build Infrastructure team @TravisCI

Herding Nulls and Other C# Stories From the Future

C# is evolving at a rather vigorous pace, aiming for new levels of expressiveness on many fronts. Let’s plant our feet firmly in the air for a bit and look at some of the places we think it’s headed: Finally reining in those pesky nulls, fighting back on callback hell for asynchronous streams, the extension of everything, and so on. Likely honorable mention of pattern matching, type classes, discriminated unions and exploding heads. You don’t have to be caught up on C# to follow.

Mads Torgersen, Chief Language Designer of C# & Contributor to TypeScript, Visual Basic, Roslyn, LINQ

[Cancelled] Rust: Systems Programming

Rust is an exciting new systems programming language that combines low-level control and predictability with the safety and ergonomics of a high-level language. Rust’s superpower is a set of concepts called “ownership” and “borrowing” which enable you to write exceptionally performant and reliable code, with the compiler acting as an assistant and multiplying factor for your work.

In this talk, we'll explore the core concepts of Rust and how they guarantee memory and thread safety without expensive and unpredictable runtime systems like garbage collection. We'll also look at how these guarantees can be impactful in other ways: enabling users to go beyond the limitations of their current environments, and supporting maintainability and reliability as projects grow in scope and size.
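
As a rough illustration of the ownership and borrowing model the abstract refers to, here is a minimal Rust sketch (not material from the talk); the function and variable names are purely illustrative.

    // Minimal, illustrative sketch of ownership and borrowing (not from the talk).
    fn total_len(lines: &[String]) -> usize {
        // `lines` is an immutable borrow: this function may read the strings,
        // but the caller keeps ownership and nothing is copied or freed here.
        lines.iter().map(|s| s.len()).sum()
    }

    fn main() {
        let lines = vec![String::from("hello"), String::from("world")];

        // Borrowing: `total_len` only receives a reference, so `lines` is still usable.
        println!("total = {}", total_len(&lines));

        // Ownership: moving `lines` into `consumed` transfers responsibility for freeing it.
        let consumed = lines;
        println!("{} items", consumed.len());

        // The compiler now rejects any further use of `lines`:
        // println!("{}", lines.len()); // error[E0382]: borrow of moved value: `lines`
    }

The point the abstract makes is that these checks happen entirely at compile time, so the safety comes without a garbage collector or other runtime cost.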

Without Boats, Rust Contributor (Language & Cargo Team Member)

Next Gen Networking Infrastructure With Rust

As the world becomes ever more connected, the scale and sophistication of network infrastructure software is increasing dramatically. However, the requirements for this software are as stringent as ever: it must not only be fast, it must be “safe”, i.e. able to process untrusted data without crashing or being vulnerable to security exploits. Traditionally, these two requirements have been at odds: network programmers had to pick a language that offered either speed or safety.

Enter Rust, a new language that lets you have your cake and eat it too. Rust compiles to native code, while ensuring at compile time that the resulting programs are memory safe.

But Rust provides even more: its higher-order abstractions enable expressive programs and powerful memory management techniques without imposing a runtime cost. In this talk, we'll show how Rust's “zero cost abstractions” can be leveraged to provide a networking platform that offers expressiveness, speed, and safety, without tradeoffs between them. We'll also describe how these techniques are used in Linkerd, a “service mesh” proxy for the next generation of cloud native applications.
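
For a flavor of what “zero cost abstractions” means in practice, here is a small illustrative Rust sketch (not Linkerd code); the helper below is hypothetical, but the generic, iterator-based pipeline it uses compiles down to roughly the same machine code as a hand-written loop.

    // Illustrative sketch of a "zero cost abstraction" (hypothetical, not Linkerd code).
    // The generic function is monomorphized and inlined by the compiler: no boxing,
    // no virtual dispatch, and no garbage collector behind the high-level pipeline.

    /// Count the newline-terminated lines in a buffer of untrusted bytes.
    fn count_lines<B: AsRef<[u8]>>(buf: B) -> usize {
        buf.as_ref()
            .iter()
            .filter(|&&b| b == b'\n')
            .count()
    }

    fn main() {
        let request = b"GET / HTTP/1.1\r\nHost: example.com\r\n\r\n";
        // Slice bounds checks mean we never read past the buffer,
        // even though the input is untrusted.
        println!("lines: {}", count_lines(request));
    }

The same property is what lets a proxy push untrusted packets through expressive, composable code without giving up throughput.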

Carl Lerche, Software Engineer & Rust Contributor
