Julia is a high-level dynamic programming language designed to address the needs of high-performance numerical analysis and computational science, without the typical need for separate compilation to be fast, while also being effective for general-purpose programming, web use, or as a specification language.
Distinctive aspects of Julia's design include a type system with parametric polymorphism in a fully dynamic programming language, and multiple dispatch as its core programming paradigm. It allows concurrent, parallel and distributed computing, and direct calling of C and Fortran libraries without glue code.
Julia is garbage-collected, uses eager evaluation and includes efficient libraries for floating-point calculations, linear algebra, random number generation, fast Fourier transforms (via FFTW, but only in current release versions; that library dependency was moved out of the standard library into a package because it is GPL-licensed, and thus will not be included in Julia 1.0 by default) and regular expression matching.
History
Work on Julia was started in 2009 by Jeff Bezanson, Stefan Karpinski, Viral B. Shah, and Alan Edelman, who set out to create a language that was both high-level and fast. On 14 February 2012 the team launched a website with a blog post explaining the language's mission. Since then, the Julia community has grown, with over 1,800,000 downloads as of January 2018. It has attracted some high-profile clients, from investment manager BlackRock, which uses it for time-series analytics, to the British insurer Aviva, which uses it for risk calculations. In 2015, the Federal Reserve Bank of New York used Julia to make models of the US economy, noting that the language made model estimation "about 10 times faster" than before (it previously used MATLAB). Julia's co-founders established Julia Computing in 2015 to provide paid support, training, and consulting services to clients, though Julia itself remains free to use. At the 2017 JuliaCon conference, Jeff Regier, Keno Fischer and others announced that the Celeste project used Julia to achieve "peak performance of 1.54 petaflops using 1.3 million threads" on 9,300 Knights Landing (KNL) nodes of the Cori supercomputer (the 5th fastest in the world at the time; 8th fastest as of November 2017). Julia thus joins C, C++, and Fortran as high-level languages in which petaflop computations have been achieved.
JuliaCon, a yearly academic conference on Julia, has been held since 2014.
Language features
According to the official website, the main features of the language are:
- Multiple dispatch: providing the ability to define function behavior across many combinations of argument types
- Dynamic type system: types for documentation, optimization, and dispatch
- Good performance, approaching that of statically-typed languages like C
- A built-in package manager
- Lisp-like macros and other metaprogramming facilities
- Call Python functions: use the PyCall package
- Call C functions directly: no wrappers or special APIs
- Powerful shell-like abilities to manage other processes
- Designed for parallel and distributed computing
- Coroutines: lightweight green threading
- User-defined types are as fast and compact as built-ins
- Automatic generation of efficient, specialized code for different argument types
- Elegant and extensible conversions and promotions for numeric and other types
- Efficient support for Unicode, including but not limited to UTF-8
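The conversion and promotion machinery mentioned in the list above can be illustrated with a brief snippet (a hypothetical example, not taken from the official site):

```julia
# Mixed-type arithmetic: operands are promoted to a common type
# before the matching method is dispatched.
println(promote(1, 2.5))   # the Int is converted to Float64
println(typeof(1 + 2.5))   # the sum is computed as a Float64
```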
Multiple dispatch (also termed multimethods in Lisp) is a generalization of single dispatch, the polymorphic mechanism used in common object-oriented programming (OOP) languages, which dispatches on the type of a single receiver and relies on inheritance. In Julia, all concrete types are subtypes of abstract types and are, directly or indirectly, subtypes of the Any type, which is the top of the type hierarchy. Concrete types cannot themselves be subtyped; composition is used instead of the inheritance used by traditional object-oriented languages (see also inheritance vs subtyping).
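A minimal sketch of multiple dispatch over such a type hierarchy (the type and function names are illustrative, not from the article):

```julia
abstract type Pet end    # abstract types can be subtyped...
struct Dog <: Pet end    # ...concrete types cannot
struct Cat <: Pet end

# The method is chosen from the runtime types of *all* arguments:
meets(a::Dog, b::Dog) = "sniffs"
meets(a::Dog, b::Cat) = "chases"
meets(a::Cat, b::Pet) = "hisses"   # matches any Pet as second argument

println(meets(Dog(), Cat()))   # dispatches on (Dog, Cat)
println(meets(Cat(), Dog()))   # falls back to the (Cat, Pet) method
```

Because the second and third methods differ only in the type of the second argument, the call site alone cannot determine which runs; the runtime types of both arguments do.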
Julia draws significant inspiration from various dialects of Lisp, including Scheme and Common Lisp, and it shares many features with Dylan, also a multiple-dispatch-oriented dynamic language (which features an ALGOL-like free-form infix syntax rather than a Lisp-like prefix syntax, while in Julia "everything" is an expression), and with Fortress, another numerical programming language (which features multiple dispatch and a sophisticated parametric type system). While Common Lisp Object System (CLOS) adds multiple dispatch to Common Lisp, not all functions are generic functions.
In Julia, Dylan, and Fortress, extensibility is the default, and the system's built-in functions are all generic and extensible. In Dylan, multiple dispatch is as fundamental as it is in Julia: all user-defined functions and even basic built-in operations like + are generic. Dylan's type system, however, does not fully support parametric types, which are more typical of the ML lineage of languages. By default, CLOS does not allow for dispatch on Common Lisp's parametric types; such extended dispatch semantics can only be added as an extension through the CLOS Metaobject Protocol. By convergent design, Fortress also features multiple dispatch on parametric types; unlike Julia, however, Fortress is statically rather than dynamically typed, with separate compilation and execution phases. The language features are summarized in the following table:
By default, the Julia runtime must be pre-installed when user-provided source code is run. Alternatively, a standalone executable that needs no Julia source code can be built with BuildExecutable.jl.
Julia's syntactic macros (used for metaprogramming), like Lisp macros, are more powerful than the text-substitution macros used in the preprocessor of some other languages such as C, because they work at the level of abstract syntax trees (ASTs). Julia's macro system is hygienic, but also supports deliberate capture when desired (as for anaphoric macros) using the esc construct.
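As a sketch of how such a macro looks (an assumed example; @twice is not a standard macro), esc deliberately escapes hygiene so the caller's variable is the one modified:

```julia
# A macro receives its argument as an expression (AST), not as text.
macro twice(ex)
    quote
        $(esc(ex))   # esc makes ex run in the caller's scope
        $(esc(ex))
    end
end

counter = 0
@twice counter += 1   # expands to two increments of the caller's variable
println(counter)
```

Without esc, hygiene would rename counter inside the expansion to a fresh variable, and the caller's counter would be untouched.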
Interaction
The official Julia distribution includes an interactive session shell, called Julia's read-eval-print loop (REPL), which can be used to experiment and test code quickly. The following fragment represents a sample session in which strings are concatenated automatically by println:
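The fragment itself does not survive in this copy of the article; a hypothetical session along those lines could be:

```julia
# println joins all of its arguments into one line of output:
name = "Julia"
println("Hello, ", name, "! 1 + 2 = ", 1 + 2)
```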
The REPL gives the user access to the system shell and to a help mode, by pressing ; or ? after the prompt (preceding each command), respectively. It also keeps the history of commands, including between sessions. Code can be tested inside Julia's interactive session or saved into a file with a .jl extension and run from the command line by typing:
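The command itself is missing from this copy; the invocation is presumably of the form (the filename is illustrative):

```shell
# Run a saved script non-interactively:
julia myscript.jl
```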
Julia is supported by Jupyter, a web-based interactive notebook environment.
Use with other languages
Julia's ccall keyword is used to call individual functions exported from C or Fortran shared libraries.
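For instance, a call into the system C library might look like this (a minimal sketch; it assumes a reachable libc exporting strlen):

```julia
# ccall takes the function, the return type, a tuple of argument
# types, and the arguments themselves -- no wrapper code is needed:
len = ccall(:strlen, Csize_t, (Cstring,), "Hello")
println(len)
```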
Julia has Unicode 10 support, with UTF-8 used by default for strings and for Julia source code; this allows, as an option, common math symbols for many operators, such as ∈ for the in operator.
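For example (a short illustrative snippet, not from the article):

```julia
# ∈ is the Unicode spelling of the in operator; √ is an alias for sqrt:
println(3 ∈ [1, 2, 3])
println(√9)
```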
Julia has packages supporting markup and data languages such as HTML (and also HTTP), XML, JSON and BSON, and for databases and web use in general.
Implementation
Julia's core is implemented in Julia and C (with the LLVM dependency in C++) plus some assembly, and its parser in Scheme ("FemtoLisp"). The LLVM compiler infrastructure project is used as the back end for generation of 64-bit or 32-bit optimized machine code, depending on the platform Julia runs on. With some exceptions (e.g., PCRE), the standard library is implemented in Julia itself. The most notable aspect of Julia's implementation is its speed, which is often within a factor of two of fully optimized C code (and thus often an order of magnitude faster than Python or R), although such benchmark claims are often disputed. Development of Julia began in 2009, and an open-source version was publicized in February 2012.
Julia 0.6 "is now considered the stable line of releases and is recommended for most users, as it provides both language and API stability" and is on a monthly release schedule in which bugs are fixed and some new features from 0.7-DEV are backported. In contrast, the 0.5 release line (and older) is no longer actively worked on; it only receives backported bug fixes on an irregular schedule, and older lines are no longer maintained.
Current and future platforms
While Julia uses a JIT (MCJIT from LLVM), it generates native machine code directly, before a function is first run; it does not produce bytecode that is run on a virtual machine (VM) or translated as it runs, as with, e.g., Java on the JVM or Dalvik on Android.
Current support covers 32- and 64-bit x86 processors (all except ancient pre-Pentium 4-era processors, in order to optimize for newer ones), and Julia also supports more; it "fully supports ARMv8 (AArch64) processors, and supports ARMv7 and ARMv6 (AArch32) with some caveats." Other platforms (beyond those mainstream CPUs, or non-mainstream operating systems) have "Community" support, or "External" support (meaning support in a package), e.g. for GPUs.
On at least some platforms Julia may need to be compiled from source code (e.g. on the original Raspberry Pi), with build options changed, while the download page otherwise has executables (and the source) available. Julia has been "successfully built" on several ARM platforms, up to e.g. "ARMv8 Data Center & Cloud Processors" such as the Cavium ThunderX (the first ARM processor with 48 cores). ARMv7 (32-bit) and ARMv8 (64-bit) have "Official" support and binaries (the first architectures to get them after x86), while PowerPC (64-bit) has "Community" support and PTX (64-bit) (meaning Nvidia's CUDA on GPUs) has "External" support.
Julia is now supported in Raspbian, though support is better for newer (e.g. ARMv7) Pis; the Julia support is promoted by the Raspberry Pi Foundation. Support for GNU Hurd is being worked on (in JuliaLang's openlibm dependency project).
Julia2C source-to-source compiler
A Julia2C source-to-source compiler from Intel Labs is available. This source-to-source compiler is a fork of Julia that emits C code instead of native machine code, for functions or whole programs (making the full Julia implementation unnecessary for running the generated C code). This makes Julia effectively much more portable, as C is very portable, with compilers available for most CPUs. The compiler is also meant to allow analyzing code at a higher level than C.
Intel's ParallelAccelerator.jl can be thought of as a partial Julia-to-C++ compiler (and then, transparently, to machine code), but its objective is parallel speedup (which can be "100x over plain Julia" for the older 0.4 version, and could in some cases also speed up serial code many fold for that version), not compiling the full Julia language to C++ (C++ is only an implementation detail; later versions might not compile to C++). It does not need to compile all of Julia's syntax, as the rest is handled by Julia itself.
Julia Computing company
Julia Computing, Inc. was founded by Viral B. Shah, Deepak Vinchhi, Alan Edelman, Jeff Bezanson, Stefan Karpinski and Keno Fischer.
In June 2017 Julia Computing raised $4.6M in seed funding from General Catalyst and Founder Collective.
See also
- Comparison of numerical analysis software
Notes
References
External links
- Official website
Source of the article: Wikipedia