Flux (machine-learning framework)


Flux is an open-source machine-learning software library and ecosystem written in Julia.[1][6] Its current stable release is v0.15.0.[4] It provides a layer-stacking interface for simpler models and emphasizes interoperability with other Julia packages rather than a monolithic design.[7] For example, GPU support is implemented transparently by CuArrays.jl.[8] This contrasts with machine-learning frameworks implemented in other languages with Julia bindings, such as TensorFlow.jl (the unofficial wrapper, now deprecated), which are limited by the functionality available in the underlying implementation, often written in C or C++.[9] Flux joined NumFOCUS as an affiliated project in December 2021.[10]
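A minimal sketch of the layer-stacking interface described above. The `Dense(in => out, activation)` constructor syntax and the `gpu` helper are taken from recent Flux releases and may differ in older versions; the GPU lines assume CUDA.jl (the successor to CuArrays.jl) is installed.

```julia
using Flux

# Stack layers into a feed-forward model with Chain.
model = Chain(
    Dense(4 => 8, relu),   # 4 inputs -> 8 hidden units
    Dense(8 => 2),         # 8 hidden units -> 2 outputs
    softmax,
)

x = rand(Float32, 4)       # a single input sample
y = model(x)               # forward pass on the CPU

# With CUDA.jl installed, the same model runs on the GPU
# simply by moving its parameters and inputs over:
# using CUDA
# gpu_model = model |> gpu
# gpu_model(gpu(x))
```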

Flux
Original author(s): Michael J Innes,[1] Dhairya Gandhi,[2] and contributors[3]
Stable release: 0.15.0[4] / 5 December 2024
Repository: github.com/FluxML/Flux.jl
Written in: Julia
Type: Machine learning library
License: MIT[5]
Website: https://fluxml.ai

Flux's focus on interoperability has enabled, for example, support for neural differential equations by combining Flux.jl and DifferentialEquations.jl into DiffEqFlux.jl.[11][12]
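A minimal sketch of the kind of composition DiffEqFlux.jl enables: a Flux network used as the right-hand side of an ODE. The `NeuralODE` constructor and its calling convention are assumptions based on the package's documented interface, which has changed across versions (newer releases favor Lux.jl layers and an explicit-parameter call signature).

```julia
using Flux, DiffEqFlux, OrdinaryDiffEq

# A small network defines the dynamics du/dt = f(u).
dudt = Chain(Dense(2 => 16, tanh), Dense(16 => 2))

# Wrap the network as a neural ODE solved with the Tsit5 integrator.
tspan = (0.0f0, 1.0f0)
node  = NeuralODE(dudt, tspan, Tsit5(); saveat = 0.1f0)

u0  = Float32[2.0, 0.0]    # initial condition
sol = node(u0)             # solve the ODE using the network as the vector field
```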

Flux supports recurrent and convolutional networks. It is also capable of differentiable programming[13][14][15] through its source-to-source automatic differentiation package, Zygote.jl.[16]
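As an illustration of the source-to-source approach, Zygote differentiates ordinary Julia functions directly, with no special tensor or graph types required; a small sketch:

```julia
using Zygote

f(x) = 3x^2 + 2x + 1

# Zygote generates a gradient program from the Julia code of f itself.
df(x) = gradient(f, x)[1]    # derivative 6x + 2

df(5.0)                      # returns 32.0
```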

Julia is a popular language for machine learning,[17] and Flux.jl is its most highly regarded machine-learning repository[17] (Lux.jl is a more recent alternative that shares much of its code with Flux.jl). A demonstration[18] of compiling Julia code to run on Google's tensor processing unit (TPU) received praise from Google Brain AI lead Jeff Dean.[19]

Flux has been used as a framework to build neural networks that operate on homomorphically encrypted data without ever decrypting it.[20][21] This kind of application is envisioned to be central to privacy in future APIs that use machine-learning models.[22]

Flux.jl is an intermediate representation for running high-level programs on CUDA hardware.[23][24] It was the predecessor to CUDAnative.jl, which is also a GPU programming language.[25]

See also

References
