Tim Holy timholy Washington University in St. Louis St. Louis, Missouri, USA http://holylab.wustl.edu Neuroscientist and developer of the Julia language and its packages, including many developer tools and those of the @JuliaImages organization

GiovineItalia/Gadfly.jl 1602

Crafty statistical graphics for Julia.

JuliaArrays/StaticArrays.jl 347

Statically sized arrays for Julia

JuliaDebug/Debugger.jl 271

Julia debugger

GiovineItalia/Compose.jl 191

Declarative vector graphics

JuliaArrays/AxisArrays.jl 116

Performant arrays where each dimension can have a named axis with values

JuliaCI/Coverage.jl 98

Take Julia code coverage and memory allocation results, do useful things with them

JuliaDebug/JuliaInterpreter.jl 97

Interpreter for Julia code

JuliaArrays/OffsetArrays.jl 81

Fortran-like arrays with arbitrary, zero or negative starting indices.

JuliaArrays/TiledIteration.jl 54

Julia package to facilitate writing multithreaded, multidimensional, cache-efficient code

JuliaArrays/MappedArrays.jl 46

Lazy in-place transformations of arrays

issue comment JuliaPlots/Plots.jl

series_annotations in bar plots in unusual positions for gr and pyplot backends

No, unfortunately not. The best approach is to add a new series without markers using the same data, then apply series_annotations to that.
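A sketch of that workaround (assuming Plots.jl with, e.g., the GR backend; the data, labels, and annotation placement here are made up for illustration):

```julia
using Plots  # assumes Plots.jl is installed

xs, ys = 1:4, [3, 5, 2, 7]
bar(xs, ys, label = "data")

# Add an invisible companion series that carries the annotations:
# same coordinates, markers hidden (markeralpha = 0), one label per bar.
scatter!(xs, ys;
         markeralpha = 0,
         label = "",
         series_annotations = text.(string.(ys), :bottom, 8))
```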

I'm sorry, I didn't get that... can you please elaborate?

JobJob

comment created time in 17 minutes

issue comment JuliaPlots/Plots.jl

series_annotations in bar plots in unusual positions for gr and pyplot backends

Any progress on this? As of Julia 1.6, it seems to me that series_annotations is completely broken, at least with GR

JobJob

comment created time in 21 minutes

issue comment JuliaLang/julia

Can't build Julia v1.6.0-beta1 for Arm32bit

The same error can be reproduced on Docker with the following Dockerfile:

FROM balenalib/raspberrypi3:buster-20200502

# install dependencies
RUN apt-get update && \
    apt-get install -y build-essential libatomic1 python gfortran perl wget m4 cmake pkg-config \
    git && \
    apt-get clean && rm -rf /var/cache/apt/archives/* /var/lib/apt/lists/*

# build julia from source
ARG JL_VERSION="v1.6.0-beta1"
ARG WDIR=/home/pi/work
ARG JL_BUILD_DIR=$WDIR/build
WORKDIR $WDIR
RUN echo "\
CXXFLAGS=-D_GLIBCXX_USE_CXX11_ABI=0\n\
prefix=/home/pi/julia-$JL_VERSION\n\
USE_BINARYBUILDER=1\n\
LDFLAGS=-latomic\n\
CFLAGS += "-mfpu=neon-vfpv4"\n\
CXXFLAGS += "-mfpu=neon-vfpv4"\n\
MARCH="armv7-a"\n\
JULIA_CPU_TARGET=\"armv7-a\;armv7-a,neon\;armv7-a,neon,vfp4\"\n\
JULIA_CPU_THREADS=4\n\
" > Make.user && \
    cat Make.user && \
    git clone --depth=1 -b $JL_VERSION https://github.com/JuliaLang/julia.git $JL_BUILD_DIR &&\
    cp Make.user $JL_BUILD_DIR && \
    cd $JL_BUILD_DIR && make -j 16 && make install && \
    echo "clean up $JL_BUILD_DIR" && \
    rm -r $JL_BUILD_DIR && \
    echo "Done"

# add path of Julia
ENV PATH=/home/pi/julia-$JL_VERSION/bin:$PATH
# runtime test
RUN julia -e "using InteractiveUtils; versioninfo()"
CMD ["julia"]
terasakisatoshi

comment created time in 40 minutes

issue opened JuliaLang/julia

Can't build Julia v1.6.0-beta1 for Arm32bit

I'm trying to build Julia for a 32-bit Arm system on my Raspberry Pi 4 (8 GB), but it failed with the following error:

...
...
...
In file included from /home/pi/work/julia/src/llvm-multiversioning.cpp:29:
/home/pi/work/julia/src/julia_internal.h:119:6: warning: #warning is a GCC extension
     #warning No cycleclock() definition for your platform
      ^~~~~~~
/home/pi/work/julia/src/julia_internal.h:119:6: warning: #warning No cycleclock() definition for your platform [-Wcpp]
    CC src/llvm-alloc-opt.o
In file included from /home/pi/work/julia/src/llvm-alloc-opt.cpp:33:
/home/pi/work/julia/src/julia_internal.h:119:6: warning: #warning is a GCC extension
     #warning No cycleclock() definition for your platform
      ^~~~~~~
/home/pi/work/julia/src/julia_internal.h:119:6: warning: #warning No cycleclock() definition for your platform [-Wcpp]
    CC src/cgmemmgr.o
In file included from /home/pi/work/julia/src/cgmemmgr.cpp:8:
/home/pi/work/julia/src/julia_internal.h:119:6: warning: #warning is a GCC extension
     #warning No cycleclock() definition for your platform
      ^~~~~~~
/home/pi/work/julia/src/julia_internal.h:119:6: warning: #warning No cycleclock() definition for your platform [-Wcpp]
    CC src/llvm-api.o
    CC src/llvm-remove-addrspaces.o
    CC src/llvm-remove-ni.o
    CC src/llvm-julia-licm.o
    CC src/llvm-demote-float16.o
    LINK usr/lib/libjulia-internal.so.1.6
    JULIA usr/lib/julia/corecompiler.ji
Segmentation fault
make[1]: *** [sysimage.mk:61: /home/pi/work/julia/usr/lib/julia/corecompiler.ji] Error 139
make: *** [Makefile:82: julia-sysimg-ji] Error 2

Here is what I did:

# on my raspberrypi4 8gb
$  git status
HEAD detached at v1.6.0-beta1
$ cat Make.user
LDFLAGS=-latomic
$ make

The full output log is available here -> https://gist.github.com/terasakisatoshi/50e9c3e6814c3a14297e6c6f62d2ec6d

It seems make julia-sysimg-ji failed because corecompiler.ji can't be created or found at ./usr/lib/julia.

pi@rpi4:~/work/julia $ cd usr/lib/julia/
pi@rpi4:~/work/julia/usr/lib/julia $ ls
pi@rpi4:~/work/julia/usr/lib/julia $    <--- nothing is listed; the directory is empty.

How do we resolve this issue?

created time in 44 minutes


push event JuliaLang/julia

Mark Kittisopikul

commit sha 26a721b28ad56c006957ee1f2de083befcfe7904

Prepend build_bindir to LLVM_CONFIG_PATH_FIX rather than append (#39275)

view details

push time in an hour

PR merged JuliaLang/julia

Prepend build_bindir to LLVM_CONFIG_PATH_FIX rather than append

Summary

Prepend rather than append $(build_bindir) to the PATH for LLVM_CONFIG_PATH_FIX on Windows to avoid using libraries from an older Julia install.

Background

Windows requires that the environment variable PATH include the build_bindir. Currently this is appended to the PATH, making the build_bindir the lowest-priority directory to search.

Also on Windows, the installer for a Julia release will append the bin directory to the PATH.

This creates a scenario where some tools may use libraries from the binary release rather than libraries in the current source tree. For example, this occurs with usr/tools/llvm-config.exe when using Cygwin. In the example below, llvm-config.exe is dynamically linked to LLVM.dll in a Julia 1.5.0 install despite trying to build Julia master (1.7.0-DEV), causing llvm-config.exe to silently fail with a return code of 127:

$ cat VERSION
1.7.0-DEV

$ ldd usr/tools/llvm-config.exe
        ntdll.dll => /cygdrive/c/WINDOWS/SYSTEM32/ntdll.dll (0x7ffc5e420000)
        KERNEL32.DLL => /cygdrive/c/WINDOWS/System32/KERNEL32.DLL (0x7ffc5d010000)
        KERNELBASE.dll => /cygdrive/c/WINDOWS/System32/KERNELBASE.dll (0x7ffc5b570000)
        msvcrt.dll => /cygdrive/c/WINDOWS/System32/msvcrt.dll (0x7ffc5cef0000)
        libstdc++-6.dll => /cygdrive/c/Users/kittisopikulm/AppData/Local/Programs/Julia 1.5.0/bin/libstdc++-6.dll (0x6fc40000)
        LLVM.dll => /cygdrive/c/Users/kittisopikulm/AppData/Local/Programs/Julia 1.5.0/bin/LLVM.dll (0x66740000)

$ usr/tools/llvm-config.exe

$ echo $?
127

$ echo $PATH
/usr/local/bin:/usr/bin:/cygdrive/c/Users/kittisopikulm/AppData/Local/Programs/Julia 1.5.0/bin:/cygdrive/c/Users/kittisopikulm/AppData/Local/Programs/Julia-1.6.0-beta1/bin

$ PATH="usr/bin:$PATH" usr/tools/llvm-config.exe
usage: llvm-config <OPTION>... [<COMPONENT>...]
Get various configuration information needed to compile programs which use
LLVM.  Typically called from 'configure' scripts.  Examples:
...

$ PATH="usr/bin:$PATH" ldd usr/tools/llvm-config.exe
        ntdll.dll => /cygdrive/c/WINDOWS/SYSTEM32/ntdll.dll (0x7ffc5e420000)
        KERNEL32.DLL => /cygdrive/c/WINDOWS/System32/KERNEL32.DLL (0x7ffc5d010000)
        KERNELBASE.dll => /cygdrive/c/WINDOWS/System32/KERNELBASE.dll (0x7ffc5b570000)
        msvcrt.dll => /cygdrive/c/WINDOWS/System32/msvcrt.dll (0x7ffc5cef0000)
        libstdc++-6.dll => /cygdrive/c/Users/kittisopikulm/source/repos/julia/usr/bin/libstdc++-6.dll (0x6fc40000)
        LLVM.dll => /cygdrive/c/Users/kittisopikulm/source/repos/julia/usr/bin/LLVM.dll (0x66740000)
...

The proposed fix is simple: prepend rather than append $(build_bindir), so that $(build_bindir) is searched for the relevant libraries first.

Further fixes may involve checking the return code of llvm-config.exe to ensure that execution has not failed silently. Currently a problem is only detected when parsing "libllvm_version" in base/version.jl; in this case Julia attempts to parse an empty string, which fails.
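The search-order effect behind this fix can be demonstrated in a few lines of plain Julia: Sys.which walks PATH front to back, so whichever directory is listed first shadows later copies. The directories and the tool name below are made up for illustration (Unix-style; Windows would additionally need an .exe suffix):

```julia
# Create two directories, each holding an executable with the same name,
# standing in for a stale install and a fresh build tree.
old_dir, new_dir = mktempdir(), mktempdir()
for d in (old_dir, new_dir)
    tool = joinpath(d, "llvm-config")
    write(tool, "#!/bin/sh\necho stub\n")
    chmod(tool, 0o755)   # must be executable for Sys.which to find it
end

# "Appended" layout: the stale directory comes first and is found.
appended = withenv("PATH" => join([old_dir, new_dir], ':')) do
    Sys.which("llvm-config")
end

# "Prepended" layout: the fresh directory comes first and wins.
prepended = withenv("PATH" => join([new_dir, old_dir], ':')) do
    Sys.which("llvm-config")
end

@assert appended  == joinpath(old_dir, "llvm-config")
@assert prepended == joinpath(new_dir, "llvm-config")
```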

+1 -1

0 comment

1 changed file

mkitti

pr closed time in an hour

issue opened JuliaLang/julia

@everywhere is slow on HPC with multi-node environment

https://github.com/JuliaLang/julia/blob/7647ab574fba6460877d0c1c571a86fdf18b5d31/stdlib/Distributed/src/macros.jl#L207

Please check here for descriptions

https://discourse.julialang.org/t/everywhere-takes-a-very-long-time-when-using-a-cluster/35724

I found that increasing nworkers causes the execution time (as seen from the master) to increase linearly. Perhaps it is somehow serialized?
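A minimal way to observe the scaling locally (a sketch; local workers won't reproduce multi-node network latency, but they show how the time grows with nworkers()):

```julia
using Distributed

# On an HPC cluster you would use addprocs with a cluster manager;
# local workers only approximate the effect.
addprocs(2)
t2 = @elapsed @everywhere f(x) = x + 1

addprocs(2)  # now 4 workers
t4 = @elapsed @everywhere f(x) = x + 1

println("2 workers: $(t2)s, 4 workers: $(t4)s")
```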

created time in 2 hours

pull request comment JuliaGPU/CUDA.jl

Backport new CUDNN interface to 1.5

Regarding the RNN design, I'm unclear whether plugging CUDNN back in would currently be the right course, given the differently designed approach taken in Flux, which only defines recurrent cells at the individual-step level. This results in a model being broadcast over a sequence, m.(x), rather than taking the full 3D input as expected by CUDNN.

That's interesting. Though a more appropriate Julia metaphor for RNNs would be not map but reduce, would it not? There is a changing state at every time step, and at the end you usually want to reduce a sequence (of vectors) to a single vector. In applications where the intermediate steps matter, these can be thought of as akin to, e.g., cumulative sums.
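The reduce metaphor above can be sketched in a few lines; a toy linear "cell" stands in for a real RNN/LSTM cell, and all names here are illustrative:

```julia
# A recurrent layer as a reduction over time steps: the hidden state is
# the accumulator, and each input vector updates it.
cell(h, x) = tanh.(0.5 .* h .+ 0.5 .* x)   # toy cell: returns new hidden state

xs = [ones(3), 2 .* ones(3), 3 .* ones(3)]  # a sequence of 3 input vectors
h0 = zeros(3)                               # initial hidden state

# accumulate keeps every intermediate state (the "cumulative sum" view)...
states = accumulate(cell, xs; init = h0)

# ...while foldl keeps only the final state, which is what you usually
# want when collapsing a sequence to a single vector.
hT = foldl(cell, xs; init = h0)

@assert hT == states[end]
```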

So under the current Flux approach to RNNs, the CUDNN functionality was not really leveraged, as it was called for each individual step of the sequence. From my understanding, this explains why resorting to the base CUDA.jl implementation resulted in similar performance (along with solving some pending issues).

This is not surprising, because most of the gain for a GPU RNN comes from being able to process the whole sequence in parallel.

All this to say: while I see it as desirable to have an approach to RNNs that matches that of CUDNN with 3D input, I think it would involve some departure from the "single step" design. Otherwise, I'm not sure there's any benefit to integrating the CUDNN RNN if the Flux RNN design isn't moved to that 3D structure. @denizyuret, was your intent to revamp the Flux RNN in that direction?

I think for research use it is essential to support CUDNN features like 3D input, multiple layers, easy bidirectionality, etc. A "single step" design is useful for demonstration and teaching, but not for most large-scale research. NNlib should probably be extended with such an interface, along with the supporting data structures like PaddedSequence etc. Flux can choose to support both the old and the new interfaces. As to how to proceed in Flux, I am not sure how the decision process works. Should @DhairyaLGandhi or @CarloLucibello have the final say?

As an example I have had an RNN interface in Knet that supports 3D, multilayer, bidirectional etc. since cudnn introduced them:

https://github.com/denizyuret/Knet.jl/blob/master/src/ops20/rnn.jl

Though nobody is completely happy with my interface, @ekinakyurek wrote another:

https://github.com/ekinakyurek/KnetLayers.jl/blob/master/src/rnn.jl

These days I am looking at PyTorch/TensorFlow/MXnet/ONNX etc to see what different API ideas are out there, and probably will design a new interface under Knet.Layers21, keeping the old one for backward compatibility.

maleadt

comment created time in 2 hours

issue comment PainterQubits/Unitful.jl

Addition of temperatures revisited

While this issue is under discussion, is there something I can do to get the following behavior?

Statistics.mean([1u"°C", 2u"°C"]) == 1.5u"°C"
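One workaround sketch while the issue is open (assuming Unitful.jl is installed; this strips to a common unit, averages the raw numbers, and reattaches the unit, sidestepping the affine-temperature addition question):

```julia
using Unitful, Statistics

temps = [1u"°C", 2u"°C"]

# ustrip(u"°C", x) converts x to °C (if needed) and drops the unit,
# so mean operates on plain numbers; multiply to reattach the unit.
m = mean(ustrip.(u"°C", temps)) * u"°C"
```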
Balinus

comment created time in 2 hours

pull request comment JuliaLang/julia

Export oneto rather than implement range(stop)

In my view, if the possibility of returning a OneTo simply and straight-forwardly "fell out" of our definition of range then we might as well return it

The least controversial road to OneTo from range is via range(; stop) or range(; length) as in #39241. At least there you have a clearly designated property and you can assume both start and step are 1, so there is no ambiguity.

One positional argument is more challenging and controversial, since you have to route from range(start; stop, length, step) and switch start to stop if we want to allow Floats as well as Integers. The method table ends up looking a bit strange, because you never quite see stop as the first argument when you examine the output of methods(range). It could be simplified if range(stop::Integer) were the only single-positional-argument form we allowed, but that's awkward because the only range argument that otherwise has to be an Integer is length. Because of all these issues, exporting oneto makes more sense to me, and it may be less controversial.

range(start; stop=nothing, length::Union{Integer,Nothing}=nothing, step=nothing) =
    _range_positional(start, step, stop, length)

...

range(stop::Integer) = range_stop(stop)

_range_positional(stop::Any    , step::Nothing,      ::Nothing, len::Nothing) =
    _range(nothing, nothing, stop, nothing) # One arg interpreted as `stop`, could be nothing
_range_positional(start::Any    , step::Any    , stop::Any,     len::Any) =
    _range(start, step, stop, len)

...

range_stop(stop) = oneunit(stop):stop
range_stop(stop::Integer) = OneTo(stop)
julia> methods(range)
# 3 methods for generic function "range":
[1] range(stop::Integer) in Main at REPL[296]:1
[2] range(start; stop, length, step) in Main at REPL[295]:1
[3] range(start, stop; length, step) in Main at REPL[156]:1

julia> range(stop) = range_stop(stop) # Simpler implementation, but messier method table below
range (generic function with 3 methods)

julia> methods(range)
# 3 methods for generic function "range":
[1] range(stop::Integer) in Main at REPL[296]:1
[2] range(stop; stop, length, step) in Main at REPL[302]:1 # That looks messy
[3] range(start, stop; length, step) in Main at REPL[156]:1

https://github.com/JuliaLang/julia/blob/a03945e518c36837d99170a66342d00ab8de64ab/base/range.jl

I'll consider resubmitting a one-positional-argument range PR after the two- and three-positional-argument range is merged, as @mbauman suggested.

mkitti

comment created time in 2 hours

issue comment JuliaLang/julia

Missed stackoverflow on large tuple due to lack of stack probing

Seems related to #28577 — we discussed probe-stack there as well.

tkf

comment created time in 2 hours


PR opened JuliaArrays/StaticArrays.jl

Fix tests from #868

Strangely enough, the tests from #868 were broken by a misplaced paren, but CI appeared to have passed on that PR. No idea what happened there...

+2 -2

0 comment

1 changed file

pr created time in 3 hours

create branch JuliaArrays/StaticArrays.jl

branch : cjf/fixup-fix-undev-newsize

created branch time in 3 hours

created tag JuliaPackaging/Preferences.jl

tag v1.2.1

Project Preferences Package

created time in 3 hours

release JuliaPackaging/Preferences.jl

v1.2.1

released time in 3 hours

issue comment JuliaPackaging/Preferences.jl

TagBot trigger issue

Triggering TagBot for merged registry pull request: https://github.com/JuliaRegistries/General/pull/28103

JuliaTagBot

comment created time in 3 hours


delete branch JuliaPackaging/Preferences.jl

delete branch : dpa/32bit

delete time in 4 hours

push event JuliaPackaging/Preferences.jl

Dilum Aluthge

commit sha cd6bf39b05b70f6f8d4dfd551a1eebeff3c41af1

Run tests with both 32-bit and 64-bit binaries (#12)

* Run tests with both 32-bit and 64-bit binaries
* Remove a comment
* Bump patch version
* Add "contributors" to authors

view details

push time in 4 hours

issue closed JuliaPackaging/Preferences.jl

TODO: Re-enable CI with 32-bit Julia binaries

In https://github.com/JuliaPackaging/Preferences.jl/pull/6, I temporarily disabled CI with 32-bit Julia binaries because tests were always failing (https://github.com/JuliaPackaging/Preferences.jl/issues/8).

After https://github.com/JuliaPackaging/Preferences.jl/issues/8 is fixed, we should re-enable CI with 32-bit Julia binaries.

closed time in 4 hours

DilumAluthge

push event JuliaPackaging/Preferences.jl

Dilum Aluthge

commit sha 6266bdc95cfc032ab5a8ea98455279d4ae5ab106

Add "contributors" to authors

view details

push time in 4 hours

push event JuliaPackaging/Preferences.jl

Dilum Aluthge

commit sha 99fa68a52c46aae06ed96adc11c50126226a5bb3

Bump patch version

view details

push time in 4 hours

pull request comment JuliaPackaging/Preferences.jl

Run tests with both 32-bit and 64-bit binaries

Codecov Report

Merging #12 (54ecb30) into master (390b319) will not change coverage. The diff coverage is n/a.

Impacted file tree graph

@@           Coverage Diff           @@
##           master      #12   +/-   ##
=======================================
  Coverage   76.69%   76.69%           
=======================================
  Files           2        2           
  Lines         103      103           
=======================================
  Hits           79       79           
  Misses         24       24           

Continue to review full report at Codecov.

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data. Powered by Codecov. Last update 390b319...ec62054.

DilumAluthge

comment created time in 4 hours

push event JuliaPackaging/Preferences.jl

Dilum Aluthge

commit sha ec6205410e157c686701eca1e6323c62bda994da

Remove a comment

view details

push time in 4 hours
