2012
DOI: 10.1145/2103621.2103714

The ins and outs of gradual type inference

Abstract: Gradual typing lets programmers evolve their dynamically typed programs by gradually adding explicit type annotations, which confer benefits like improved performance and fewer run-time failures. However, we argue that such evolution often requires a giant leap, and that type inference can offer a crucial missing step. If omitted type annotations are interpreted as unknown types, rather than the dynamic type, then static types can often be inferred, thereby removing unnecessary assumptions of the dynamic type. …
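
To make the contrast concrete, here is a minimal TypeScript sketch (an assumption for illustration; the paper formalizes a core calculus, and TypeScript does not itself infer parameter types from function bodies). Reading an omitted annotation as the dynamic type leaves every use unchecked, while reading it as an unknown type lets inference recover a static type:

```typescript
// Omitted annotation read as the dynamic type: x is "any", so uses of x
// are unchecked and type errors surface only at run time.
function incrementDyn(x: any): any {
  return x + 1;
}

// Omitted annotation read as an unknown type to be solved for: from the
// use "x + 1", a hypothetical inference pass concludes x : number, yielding
// statically checked code (the inferred type is written out explicitly here).
function incrementInf(x: number): number {
  return x + 1;
}
```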

Cited by 20 publications (22 citation statements). References 27 publications.

Citation statements (ordered by relevance):
“…For example, [15] perform 226 refactorings and add 177 type annotations to reduce false positives, which eventually leads to the discovery of 8 errors; [22] find that up to 39% of the checked code violates type properties (for the deltablue benchmark, which has one warning from TypeDevil). Optimizing JIT compilers benefit from inferred types to emit type-specialized code [20], [24], [36]. Ahn et al. [3] improve the object representation to enable optimizations even though objects with different prototypes appear at a single code location.…”
Section: Related Work (mentioning)
confidence: 99%
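
As a rough illustration of the type-specialization idea in [20], [24], [36] (a toy TypeScript sketch, not code from the cited compilers): once the operand types at a site are known, the generic operation can be replaced by an unboxed fast path.

```typescript
// Generic addition must test its operands' types on every call.
function addGeneric(a: unknown, b: unknown): unknown {
  if (typeof a === "number" && typeof b === "number") {
    return addNumbersFast(a, b); // dispatch to the specialized path
  }
  return String(a) + String(b); // generic fallback: string concatenation
}

// The shape of code a JIT can emit once inference establishes that both
// operands are numbers: no type tests, no boxing.
function addNumbersFast(a: number, b: number): number {
  return a + b;
}
```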
“…TypeDevil has a relatively low number of false positives because it uses a mostly dynamic analysis, because it focuses on individual problematic code locations instead of inferring sound types for the entire program, and because it uses a set of novel techniques to deal with intentionally polymorphic code. Second, optimizing JIT compilers leverage static and dynamic type inference to emit code specialized towards particular runtime types [20], [24], [36]. Instead of improving the performance of existing programs, TypeDevil warns developers about problematic code locations.…”
Section: Introduction (mentioning)
confidence: 99%
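
A hypothetical sketch of the kind of check such a dynamic analysis performs (not TypeDevil's actual implementation; the helper names and code location below are made up): record the types observed at each instrumented location and warn when a single location sees conflicting types.

```typescript
// Map from code location to the set of runtime type names seen there.
const observedTypes = new Map<string, Set<string>>();

function recordObservation(location: string, value: unknown): void {
  const types = observedTypes.get(location) ?? new Set<string>();
  types.add(typeof value);
  observedTypes.set(location, types);
  // A location observing more than one type is a candidate inconsistency.
  if (types.size > 1) {
    console.warn(`${location}: inconsistent types: ${[...types].join(", ")}`);
  }
}

// An instrumented program reports each value flowing through a location:
recordObservation("deltablue.js:42", 10);
recordObservation("deltablue.js:42", "ten"); // warns: number, string
```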
“…From this perspective, our type system studies the combination of variational types and gradual types. Gradual languages with type inference [Garcia and Cimini 2015; Rastogi et al. 2012; Siek and Vachharajani 2008] were a large influence on migrational typing. While ITGL was used as the basis for formalizing our type system, we expect that our approach can be extended to handle other features in this line of work.…”
Section: Relation To Gradual Typing (mentioning)
confidence: 99%
“…Gradual type inference with flow-based typing [Rastogi et al. 2012] has been explored to make programs in dynamic object-oriented languages more performant. Since our work is formalized on ITGL, it inherits the relations between ITGL and the flow-based inference [Garcia and Cimini 2015].…”
Section: Relation To Gradual Typing (mentioning)
confidence: 99%
“…Other approaches include the work of Rastogi et al. [10], using local type inference to significantly reduce the number of casts that are required; the work of Herman et al. [7], in which they propose to use coercions instead of proxies in a chain of higher-order casts, so as to be able to combine adjacent coercions in order to limit space consumption; and the work of Siek et al. [13], in which they go a step further, developing threesomes as a data structure and algorithm to represent and normalize coercions. A threesome is a cast with three positions: source, target, and an intermediate lowest type.…”
Section: Extended Abstract (mentioning)
confidence: 99%
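
The threesome description lends itself to a small sketch. The following TypeScript is a heavily simplified, hypothetical rendering (base types only; the full system of Siek et al. handles function types and blame): composing two adjacent casts meets their middle types, so an arbitrarily long chain of casts collapses into one threesome of bounded size.

```typescript
// A tiny flat type lattice: "dyn" is the top element; distinct base types
// meet at "fail", marking a cast that is guaranteed to fail.
type Ty = "dyn" | "int" | "bool" | "fail";

// A threesome: source and target of the cast, plus the lowest
// intermediate type the value must pass through.
interface Threesome { source: Ty; middle: Ty; target: Ty; }

// Greatest lower bound on the flat lattice.
function meet(a: Ty, b: Ty): Ty {
  if (a === "dyn") return b;
  if (b === "dyn") return a;
  return a === b ? a : "fail";
}

// Two adjacent threesomes normalize into one, keeping cast chains at
// constant space.
function compose(first: Threesome, second: Threesome): Threesome {
  return {
    source: first.source,
    middle: meet(first.middle, second.middle),
    target: second.target,
  };
}

// Example: (int => dyn) followed by (dyn => bool) collapses to a single
// threesome whose middle is "fail": the cast must fail when applied.
const up: Threesome = { source: "int", middle: "int", target: "dyn" };
const down: Threesome = { source: "dyn", middle: "bool", target: "bool" };
console.log(compose(up, down)); // { source: "int", middle: "fail", target: "bool" }
```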