All Scale By The Bay 2021 talks are available at https://www.youtube.com/playlist?list=PLNESult6cnOkFhdJMJXMM1ikifyE6ZvzM
Forwarded from Афиша JetBrains Research
PLT laboratory seminar
Designing Efficient Systems with Multi-Stage Programming and Other Type-Safe Metaprogramming Techniques
🕑 December 20, 2021 (Monday), 13:30 MSK
❗️Please note the unusual start time of this talk
🗣 Lionel Parreaux
📃 In this talk, I will present the results of my PhD at EPFL, Lausanne, which I obtained in 2020.
Software engineering practices have been steadily moving towards higher-level programming languages and away from lower-level ones. High-level languages tend to greatly improve safety, productivity, and code maintainability because they handle various implementation details automatically, allowing programmers to focus on their problem domains.
However, the gains offered by high-level languages usually come at the cost of reduced performance: higher-level languages tend to consume more memory, run more slowly, and require expensive garbage-collected runtime systems. This trend has been worsening with the industry's increasing adoption of the functional programming paradigm. Modern programmers thus face a dilemma: should they favor productivity and lower maintenance costs at the expense of performance, or should they focus on performance, to the detriment of almost everything else?
The main idea behind my PhD thesis was that we can help resolve this dilemma by making advances in type systems, metaprogramming, and compiler technology. In particular, we study how metaprogramming via statically typed quasiquotation lets programmers define their own domain-specific optimizations safely, while leveraging the latest advances in intermediate program representations.
We will look at the design and implementation of the Squid metaprogramming framework, which extends the Scala programming language with multi-stage programming capabilities and more. We will study a few of Squid's application examples, which include a polymorphic yet efficient library for linear algebra, a stream fusion engine improving on the state of the art, a demonstration of query compilation by rewriting, a staged SQL database system prototype, and a new embedded domain-specific language for expressing queries over collections of data.
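For readers new to the topic, the core idea of multi-stage programming can be sketched without Squid's machinery (this is illustrative only; Squid works with typed Scala quasiquotes over ASTs, not closures, so the sketch below is in Python for brevity). We "stage" exponentiation: the exponent is known at an earlier stage, so the recursion over it runs once, leaving a residual function that only multiplies.

```python
# Stage 1 runs at specialization time: the recursion over n is unrolled
# into a chain of closures. Stage 2 (the returned function) only multiplies.
def stage_power(n):
    """Specialize x**n on a statically known n, returning a function of x alone."""
    if n == 0:
        return lambda x: 1
    rest = stage_power(n - 1)          # unrolled at staging time
    return lambda x: x * rest(x)       # residual stage-2 code

cube = stage_power(3)   # the "loop" over n happens once, here
print(cube(2))          # 8
```

In a typed staging framework like Squid, the residual program is a typed code value rather than a closure, so specializations like this are checked and can be further rewritten before compilation.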
📚Type-Safe Metaprogramming and Compilation Techniques For Designing Efficient Systems in High-Level Languages (https://infoscience.epfl.ch/record/281735?ln=en)
📚Finally, a Polymorphic Linear Algebra Language (https://drops.dagstuhl.de/opus/volltexte/2019/10817/)
🇷🇺 Meeting language: English
🔗 Meeting link: https://meet.google.com/myu-dhmz-gvu
🗓 Add to calendar
📼 Recordings of this meeting and other lab seminars are available on YouTube.
#семинар, #plt, #Multi_stage_programming, #metaprogramming, #type_safety, #extensible_compilers, #Scala
Abramsky & Shah [2021], "Relating Structure and Power: Extended Version" https://arxiv.org/abs/2010.06496
There is a remarkable divide in the field of logic in Computer Science, between two distinct strands: one focusing on semantics and compositionality (“Structure”), the other on expressiveness and complexity (“Power”). It is remarkable because these two fundamental aspects of our field are studied using almost disjoint technical languages and methods, by almost disjoint research communities. We believe that bridging this divide is a major issue in Computer Science, and may hold the key to fundamental advances in the field.
In this paper, we develop a novel approach to relating categorical semantics, which exemplifies the first strand, to finite model theory, which exemplifies the second.
Forwarded from Absolute Nikola
Hi everyone! I've uploaded the recordings from the last meetup 🍿
- Generating some proof terms with coq-elpi, and recursive typeclass instances in Coq for heterogeneous collections (Andrey Lyashin)
https://youtu.be/PG7TkmiIkko
- We don't [really] need your ORM, or how we wrote dsql (Artyom Alekseev)
https://youtu.be/oqlddGlTJOM
- Using fp-ts together with React, demonstrated on simple examples (Sergey Gulin)
https://youtu.be/L8quhHCCNRU
Phase Distinctions in Type Theory (a talk) https://www.cs.cmu.edu/~rwh/talks/ti-talk.pdf
by R. Harper et al.
The second part is about calf, the Cost-Aware Logical Framework
A cost-aware logical framework, by Y. Niu, J. Sterling, H. Grodin, R. Harper
We present calf, a cost-aware logical framework for studying quantitative aspects of functional programs. Taking inspiration from recent work that reconstructs traditional aspects of programming languages in terms of a modal account of phase distinctions, we argue that the cost structure of programs motivates a phase distinction between intension and extension. Armed with this technology, we contribute a synthetic account of cost structure as a computational effect in which cost-aware programs enjoy an internal noninterference property: input/output behavior cannot depend on cost. As a full-spectrum dependent type theory, calf presents a unified language for programming and specification of both cost and behavior that can be integrated smoothly with existing mathematical libraries available in type theoretic proof assistants. We evaluate calf as a general framework for cost analysis by implementing two fundamental techniques for algorithm analysis: the method of recurrence relations and physicist’s method for amortized analysis. We deploy these techniques on a variety of case studies: we prove a tight, closed bound for Euclid’s algorithm, verify the amortized complexity of batched queues, and derive tight, closed bounds for the sequential and parallel complexity of merge sort, all fully mechanized in the Agda proof assistant. Lastly we substantiate the soundness of quantitative reasoning in calf by means of a model construction.
https://arxiv.org/pdf/2107.04663.pdf
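The flavor of "cost as a computational effect" from the abstract can be sketched in a few lines (illustrative only; calf is a dependent type theory mechanized in Agda, not a library, and the names below are made up for this sketch). Every computation carries a (value, cost) pair, sequencing adds costs, and noninterference holds by construction: the value component is computed without ever inspecting the cost component.

```python
def ret(v):
    """Return a value at zero cost."""
    return (v, 0)

def charge(comp, steps=1):
    """Charge extra steps to an already-computed result."""
    v, c = comp
    return (v, c + steps)

def bind(comp, f):
    """Sequence two computations, summing their costs."""
    v, c = comp
    v2, c2 = f(v)
    return (v2, c + c2)

def gcd(a, b):
    """Euclid's algorithm, charging one step per recursive call."""
    if b == 0:
        return ret(a)
    return charge(bind(ret(a % b), lambda r: gcd(b, r)))

value, cost = gcd(21, 13)
print(value, cost)  # 1 6 -- consecutive Fibonacci arguments are the worst case
```

In calf itself, the analogue of this cost bound (and the amortized analyses mentioned in the abstract) is stated and proved inside the type theory rather than observed at runtime.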
Cost-Aware Type Theory
Y. Niu, R. Harper
Although computational complexity is a fundamental aspect of program behavior, it is often at odds with common type theoretic principles such as function extensionality, which identifies all functions with the same input-output behavior. We present a computational type theory called CATT that has a primitive notion of cost (the number of evaluation steps). We introduce a new dependent function type “funtime” whose semantics can be viewed as a cost-aware version of function extensionality. We prove a collection of lemmas for CATT, including a novel introduction rule for the new funtime type. CATT can be simultaneously viewed as a framework for analyzing computational complexity of programs and as the beginnings of a semantic foundation for characterizing feasible mathematical proofs.
https://arxiv.org/pdf/2011.03660.pdf
Forwarded from Just links
A Neural Network Solves and Generates Mathematics Problems by Program Synthesis: Calculus, Differential Equations, Linear Algebra, and More https://arxiv.org/abs/2112.15594
Symbolic and Automatic Differentiation of Languages,
Conal Elliott
http://conal.net/papers/language-derivatives/
Contextual Effect Polymorphism Meets Bidirectional Effects (Extended Abstract)
https://youtu.be/Ws4vvOJ5c1k
https://icfp21.sigplan.org/details/TyDe-2021/8/Contextual-Effect-Polymorphism-Meets-Bidirectional-Effects-Extended-Abstract-
Forwarded from Sergey Kucherenko
actris2.pdf
Actris 2.0: Asynchronous session-type based reasoning in separation logic
by J. Kastberg Hinrichsen, J. Bengtson and R. Krebbers
Deep neural networks as nested dynamical systems
David I. Spivak, Timothy Hosgood
There is an analogy that is often made between deep neural networks and actual brains, suggested by the nomenclature itself: the "neurons" in deep neural networks should correspond to neurons (or nerve cells, to avoid confusion) in the brain. We claim, however, that this analogy doesn't even type check: it is structurally flawed. In agreement with the slightly glib summary of Hebbian learning as "cells that fire together wire together", this article makes the case that the analogy should be different. Since the "neurons" in deep neural networks are managing the changing weights, they are more akin to the synapses in the brain; instead, it is the wires in deep neural networks that are more like nerve cells, in that they are what cause the information to flow. An intuition that nerve cells seem like more than mere wires is exactly right, and is justified by a precise category-theoretic analogy which we will explore in this article. Throughout, we will continue to highlight the error in equating artificial neurons with nerve cells by leaving "neuron" in quotes or by calling them artificial neurons.
We will first explain how to view deep neural networks as nested dynamical systems with a very restricted sort of interaction pattern, and then explain a more general sort of interaction for dynamical systems that is useful throughout engineering, but which fails to adapt to changing circumstances. As mentioned, an analogy is then forced upon us by the mathematical formalism in which they are both embedded. We call the resulting encompassing generalization deeply interacting learning systems: they have complex interaction as in control theory, but adaptation to circumstances as in deep neural networks.
https://arxiv.org/abs/2111.01297
Algorithmics
IFIP’s Working Group 2.1 (Richard Bird, Jeremy Gibbons, Ralf Hinze, Peter Höfner, Johan Jeuring, Lambert Meertens, Bernhard Möller, Carroll Morgan, Tom Schrijvers, Wouter Swierstra and Nicolas Wu)
http://www.cs.ox.ac.uk/people/jeremy.gibbons/publications/ifip60.pdf
Continuation-Passing Style, Defunctionalization, Accumulations, and Associativity
J. Gibbons
http://www.cs.ox.ac.uk/jeremy.gibbons/publications/continued.pdf
Effective Metatheory for Type Theory, thesis
by P. Haselwarter
https://haselwarter.org/assets/pdfs/effective-metatheory-for-type-theory.pdf
A guide to immunotherapy for COVID-19
https://www.nature.com/articles/s41591-021-01643-9
Nature Medicine - This Review aims to support clinical decision-making by providing an overview of the evidence for immunotherapy strategies in patients with COVID-19.