Creating Programming Languages in non-English
Felienne Hermans
Verona: co-designing the type system, concurrency model, and runtime system
Sophia Drossopoulou
Verona is a language designed and implemented by teams at Microsoft, Uppsala and Imperial College. Building on the teams’ experiences with Pony, the new language aims to support high-level, high-performance, highly concurrent, safe systems programming. Verona extends the actor paradigm with asynchronous execution so that multiple actors may form temporal alliances and execute behaviours in concert. It has an ownership-based type system that ensures data-race freedom and supports fine-grained, region-specific garbage collection. I will introduce Verona through examples and explore the design space of causal ordering for behaviours and its ramifications for reasoning about Verona. I will outline the implementation of a C++ runtime and compare Verona to Pony.
On-the-fly concurrent garbage collection formalization and verification
Tony Hosking
I introduce the basic principles of concurrent garbage collection, including on-the-fly collection, focusing on the correctness challenges that concurrent mutation by the application program poses for garbage collectors. I then discuss approaches to solving these problems, and characterize the solutions, with respect to an abstract garbage collection algorithm, in terms of the invariants that preserve the algorithm's safety. Next, I present a concrete garbage collection algorithm designed to operate efficiently on weak-memory multicore hardware, and argue informally for its correctness. Finally, I describe how we have been able to formally and mechanically verify the correctness of this algorithm under weak memory using the Isabelle proof assistant.
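To make the core difficulty concrete, here is a minimal sketch (not the algorithm verified in the lecture) of tri-color marking with a Dijkstra-style insertion write barrier. The barrier preserves the strong tri-color invariant: no black (fully scanned) object may point to a white (unvisited) object, so a concurrently running mutator cannot hide a live object from the collector. All names here are illustrative.

```python
# Colors of the tri-color abstraction: white = unvisited, grey = seen
# but not yet scanned, black = fully scanned.
WHITE, GREY, BLACK = 0, 1, 2

class Obj:
    def __init__(self, name):
        self.name = name
        self.fields = []       # outgoing references
        self.color = WHITE

def write_barrier(src, dst, worklist):
    """Run by the mutator on every pointer store `src.field = dst`.

    If a black object gains a reference to a white object, shade the
    white object grey so the collector will still trace it.
    """
    if src.color == BLACK and dst.color == WHITE:
        dst.color = GREY
        worklist.append(dst)
    src.fields.append(dst)

def mark(roots):
    """Standard tri-color marking from the roots."""
    worklist = list(roots)
    for r in roots:
        r.color = GREY
    while worklist:
        obj = worklist.pop()
        for child in obj.fields:
            if child.color == WHITE:
                child.color = GREY
                worklist.append(child)
        obj.color = BLACK      # all children now shaded
    # anything still WHITE afterwards is unreachable
```

Without the barrier, a mutator could copy a pointer to a white object into an already-black object and then erase every other path to it, and marking would wrongly reclaim a live object.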
Advances in production concurrent collectors over the past decade
Tony Hosking
We will look at the evolution of production concurrent collectors over the past decade, focusing primarily on concurrent marking and compaction. Collectors described include Compressor, Azul’s Pauseless/C4 and Collie, OpenJDK’s ZGC and Generational ZGC, OpenJDK’s Shenandoah, and, as a counterpoint, LXR (a mostly non-concurrent, mostly reference-counting compacting collector).
Interpreters: Everywhere and All the Time
Stefan Marr
Implementers often start with an interpreter to sketch how a language may work. Interpreters are easy to implement and great to experiment with; they are also an essential part of dynamic language implementations. We will talk about the basics of abstract syntax trees and bytecodes, and how these ideas can be used to implement a language. We will also look into optimizations for interpreters: how AST and bytecode interpreters can use run-time feedback to improve performance, and how supernodes and superinstructions allow us to make effective use of modern CPUs.
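As a flavour of the AST approach, here is a hedged, minimal sketch of an AST interpreter (names are illustrative, not from the lecture): each node class knows how to evaluate itself against an environment, and evaluation is a recursive walk over the tree.

```python
# A tiny AST interpreter: literals, variable lookups, and addition.
class Lit:
    def __init__(self, value):
        self.value = value
    def eval(self, env):
        return self.value

class Var:
    def __init__(self, name):
        self.name = name
    def eval(self, env):
        return env[self.name]      # look the variable up in the environment

class Add:
    def __init__(self, left, right):
        self.left, self.right = left, right
    def eval(self, env):
        return self.left.eval(env) + self.right.eval(env)

# Evaluate (x + 1) with x bound to 41.
expr = Add(Var("x"), Lit(1))
print(expr.eval({"x": 41}))        # prints 42
```

A bytecode interpreter linearizes the same tree into instructions for a loop-and-dispatch engine; the run-time-feedback optimizations in the lecture apply to both styles.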
A Brief Introduction to Just-in-Time Compilation
Stefan Marr
Since the early days of object-oriented languages, run-time polymorphism has been a challenge for implementers. Smalltalk and Self pushed many ideas to an extreme, and their implementers had to invent techniques such as lookup caches, tracing and method-based compilation, deoptimization, and maps. While these ideas originated in the '80s and '90s, they remain key ingredients of today’s just-in-time compilers for Java, Ruby, Python, and JavaScript.
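To illustrate the first of those techniques, here is a minimal sketch of a lookup cache at a call site (a simplified model, not any particular VM's implementation): the site remembers the last receiver class and the method it resolved to, so repeated calls on receivers of the same class skip the full method lookup.

```python
# A call site with a one-entry (monomorphic) lookup cache.
class CallSite:
    def __init__(self, selector):
        self.selector = selector
        self.cached_class = None
        self.cached_method = None
        self.lookups = 0                     # count slow-path lookups

    def invoke(self, receiver, *args):
        cls = type(receiver)
        if cls is not self.cached_class:     # cache miss: full lookup
            self.lookups += 1
            self.cached_class = cls
            self.cached_method = getattr(cls, self.selector)
        return self.cached_method(receiver, *args)

class Point:
    def __init__(self, x):
        self.x = x
    def describe(self):
        return f"Point({self.x})"

site = CallSite("describe")
for p in [Point(1), Point(2), Point(3)]:
    site.invoke(p)
# Only the first call took the slow path; the rest hit the cache.
```

Real VMs extend this to polymorphic inline caches with several entries, and the cached class information doubles as the type feedback that drives the JIT compilers discussed in the lecture.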
JavaScript AOT compilation
Manuel Serrano
JavaScript is difficult to implement efficiently because the meaning of an expression can vary across executions in ways that no syntactic or type annotation distinguishes. The common intuition is that only JIT compilers can handle it efficiently, because they rely on heuristic strategies that require observing both the program and its data. But static (AOT) compilers can speculate and use heuristics too! To demonstrate this, we propose Hopc, an AOT compiler for JavaScript, based on the following assumption: the most likely data structure a program will use is the one for which the compiler can produce its best code. Thus, contrary to most AOT compilers, Hopc does not rely on complex static analyses. It simply generates its best code and protects it with guards. In this lecture, I will present the basic techniques for compiling classical dynamically typed functional languages such as Scheme, and then dig into the specificities of JavaScript.
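The guard idea can be sketched as follows (a hand-written illustration, not Hopc's actual generated code): the compiler emits a fast path specialised for the expected case, preceded by a guard that falls back to the generic semantics whenever the speculation fails.

```python
def generic_add(a, b):
    # Simplified JavaScript-style '+': numbers add; otherwise both
    # operands are converted to strings and concatenated.
    if isinstance(a, (int, float)) and isinstance(b, (int, float)):
        return a + b
    return str(a) + str(b)

def specialised_add(a, b):
    # Code speculatively compiled for the "both integers" case.
    if type(a) is int and type(b) is int:   # guard
        return a + b                        # best code for the likely case
    return generic_add(a, b)                # fall back to full semantics
```

The fast path costs only a cheap type check when the speculation holds, and correctness is preserved in every other case by the fallback.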
Programming languages on the web, now and the future
Michael Lippautz
The web is ubiquitous. In this talk we’ll take a whirlwind tour exploring how this system ticks: from a high-level overview of how the web still evolves (quite significantly) down to Chromium (the browser), Blink (the rendering engine), and V8 (the JavaScript virtual machine) as a concrete implementation of the web platform. We’ll cover it all! For V8 specifically, we’ll take a look at its object model, interpreters, various compilers, and garbage collection, and see how it accommodates the web’s unique requirements. We’ll see how Blink provides access to the document object model (DOM) via V8’s foreign function interface. We won’t stop at polyglot systems, but will continue by looking at the future of the web’s programming languages with WebAssembly and the new capabilities it offers.
Solidifying the Software Foundations
Zhendong Su
Software applications and technologies are built on top of foundational systems such as compilers, databases, and theorem provers. Such foundations form the trusted computing base and fundamentally impact software quality and security, so solidifying them is a critical challenge. These lectures will introduce general, effective techniques and the extensive, impactful efforts that have found thousands of critical issues in widely used compilers, database management systems, and SMT solvers. They will focus on the high-level principles and core techniques, their significant practical successes, and future opportunities and challenges.
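One workhorse technique in this space is differential testing: feed the same randomly generated input to two implementations that should agree, and flag any divergence as a bug in one of them. Here is a toy, self-contained sketch in that spirit (not one of the lecture's actual tools): the two "implementations" are Python's own `eval` and a tiny hand-written evaluator for `+`/`*` expressions.

```python
import ast
import operator
import random

OPS = {ast.Add: operator.add, ast.Mult: operator.mul}

def tiny_eval(node):
    """A second, independent evaluator for simple arithmetic."""
    if isinstance(node, ast.Expression):
        return tiny_eval(node.body)
    if isinstance(node, ast.Constant):
        return node.value
    if isinstance(node, ast.BinOp):
        return OPS[type(node.op)](tiny_eval(node.left), tiny_eval(node.right))
    raise ValueError("unsupported node")

def random_expr(depth=3):
    """Generate a random arithmetic expression as source text."""
    if depth == 0 or random.random() < 0.3:
        return str(random.randint(0, 9))
    op = random.choice(["+", "*"])
    return f"({random_expr(depth - 1)} {op} {random_expr(depth - 1)})"

def differential_test(runs=100):
    for _ in range(runs):
        src = random_expr()
        reference = eval(src)                            # oracle
        candidate = tiny_eval(ast.parse(src, mode="eval"))
        assert reference == candidate, f"divergence on {src}"

differential_test()
```

Real compiler- and DBMS-testing efforts replace the random expression generator with sophisticated program or query generators, and the oracle with another compiler, another optimization level, or a metamorphic transformation of the input.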
Programming Language Foundations in Agda
Phil Wadler
The most profound connection between logic and computation is a pun. The doctrine of Propositions as Types asserts that a certain kind of formal structure may be read in two ways: either as a proposition in logic or as a type in computing. Further, a related structure may be read either as the proof of the proposition or as a program of the corresponding type. Further still, simplification of proofs corresponds to evaluation of programs. Accordingly, the title of this course also has two readings. It may be parsed as “(Programming Language) Foundations in Agda” or “Programming (Language Foundations) in Agda” — the specifications we will write in the proof assistant Agda both describe programming languages and are themselves programs. The textbook for the course is a literate Agda script, freely available online at https://plfa.inf.ed.ac.uk.
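The correspondence can be shown in a few lines. The course uses Agda, but the same pun reads identically in Lean (this snippet is my illustration, not from the course materials): a proof that "A and B implies B and A" is literally a function that swaps the two components of a pair.

```lean
-- A proof read as a program: the proposition A ∧ B → B ∧ A is a type,
-- and the function swapping the pair's components inhabits it.
example (A B : Prop) : A ∧ B → B ∧ A :=
  fun ⟨a, b⟩ => ⟨b, a⟩
```

Simplifying this proof term is exactly evaluating the function, which is the third reading of the doctrine mentioned above.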
Convergent Evolution: Everybody's talking about Regions and Lifetimes, but it's all Ownership to me
James Noble
Rust has popularised ownership type systems that grant programmers explicit control over their programs' memory management, within the context of a practical systems programming language. Rust's design draws on more than 30 years of work across many different programming language research communities: linear functional programming (ICFP), region allocation (PLDI), ownership types (OOPSLA), alias burying (ECOOP). This talk -- based on a POPL 2023 tutorial curated jointly with Tobias Wrigstad -- will attempt to argue that Rust's lifetimes, ML's regions, Pony's capabilities, and C++'s unique_ptr are examples of Convergent Evolution in programming language design. Compressing six hours of material from a nominally three-hour tutorial into a 3-second "blipvert", I'll go on to speculate about likely further trajectories of language designs and ownership.