Aarni Koskela (akx) · @valohai · Finland · https://akx.github.io/

Programmer generalist and general enthusiast.

6aika/issue-reporting 6

Reference Open311 API server implementation

akx/autotypes 6

Finds non-stub TypeScript @types/ packages for your package.json.

akx/afsp-mirror 2

Mirror of AFsp (Audio File Programs and Routines)

akx/arpy 2

A Web MIDI arpeggiator/step sequencer thing.

akx/ant1gravity-hearthstone-tier-lists 1

Parses Ant1gravity's Hearthstone tier list spreadsheets into a more greppable form.

akx/asdfsanat 1

Find paths and chains of Finnish words, e.g. pala - ala - sala

6aika/rest-validator 0

6Aika Rest API validator

akx/100-ideaa 0

Pikku Kakkonen's "100 ideas" as a talking random-draw machine.

akx/abyss 0

Python statistical profiler

akx/acmetool 0

:lock: acmetool, an automatic certificate acquisition tool for ACME (Let's Encrypt)

started ykarikos/sheet-music

started time in 4 hours

started smore-inc/clippy.js

started time in 8 hours

push event akx/listeningclub

Aarni Koskela

commit sha 689599018fd98742fe81dcacbce59d477ce5bc41

Updoot

push time in 20 hours

pull request comment 6aika/issue-reporting

Modern Django

Yeah, this project just came up in the https://koodiklinikka.fi/ Slack.

I figured I could give a bit of a spring... uh... fall cleaning. :)

akx

comment created time in 3 days

pull request comment PalPalash/hackathon_whjr-covid-calculator

Cleanup

@PalPalash Rebased. Feel free to merge.

akx

comment created time in 3 days

push event akx/hackathon_whjr-covid-calculator

PalPalash

commit sha 2ef6abda3bc451926e7e9c8e96efced700e4ae77

commit

PalPalash

commit sha 3353a0ce131a5daef4cae550b0cc2cee25b98e5e

commit

Aarni Koskela

commit sha 7092c5b492dda9e43459c0655cd071a6f3bf43d6

get-pip doesn't belong in the repo

Aarni Koskela

commit sha 232d9617dbb7ee7d6666476f1d227e547fef890c

Use standard requirements.txt

Aarni Koskela

commit sha 06eb51554e47a1f0cea63c6c19d34da47f6767e5

Fix myTraining.py

Aarni Koskela

commit sha b888ee0b507406c33d20541d9e65ccba50fd0c6c

Fix main.py to do something useful

push time in 3 days

PR opened PalPalash/hackathon_whjr-covid-calculator

Cleanup

Based on https://stackoverflow.com/questions/63950831/python-code-not-working-pandas-pickle-clf here's a pull request that gets things working, to some degree.

  • The get-pip script shouldn't be in the repo.
  • Requirements are now listed in requirements.txt, and the installer scripts and instructions are updated accordingly.
  • The training script now uses a standard scikit-learn train/test split and successfully saves the data.
  • The headers in the data CSV are fixed to have no spaces.
  • main.py now outputs the probability result, not just "hello world".

Next steps would be refactoring the feature engineering into a reusable module (so one doesn't have to guess the original feature order when running inference) and adding proper input handling to the web interface.

Also, the project doesn't seem to need pickle-mixin for anything, but I didn't remove it just yet.
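For reference, the core of a train/test split can be sketched with the stdlib alone; this is an illustrative sketch, not the PR's code (scikit-learn's `train_test_split` additionally handles arrays, stratification, and more):

```python
import random

def train_test_split(rows, test_size=0.25, seed=42):
    """Deterministically shuffle the rows and split them into train and test sets."""
    rows = list(rows)
    random.Random(seed).shuffle(rows)
    n_test = int(len(rows) * test_size)
    return rows[n_test:], rows[:n_test]

train, test = train_test_split(range(100))
```

The seed makes the split reproducible between training runs, which matters when you later want to compare saved models.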

+56 -23731

0 comments

10 changed files

pr created time in 3 days

create branch akx/hackathon_whjr-covid-calculator

branch: cleanup

created branch time in 3 days

push event 6aika/issue-reporting

Aarni Koskela

commit sha c23de5b64d146a1cf728f6c10f09a7ba93fafa3f

Fix Travis build

push time in 3 days

push event 6aika/issue-reporting

Aarni Koskela

commit sha 8abb210c396629b812ea1b863837a06cd060dfa2

Try to fix Travis build

push time in 3 days

push event 6aika/issue-reporting

Aarni Koskela

commit sha 947a34d834257f8007ee1c536eb386b05eb068f8

Try to fix Travis build

push time in 3 days

push event 6aika/issue-reporting

Aarni Koskela

commit sha 71955e1079c7460f11c65df723f23e3c16f27ede

Try to fix Travis build

push time in 3 days

push event 6aika/issue-reporting

Aarni Koskela

commit sha d706e362f5e96aa089de1a885d31be91ca28ddbb

Try to fix Travis build

push time in 3 days

push event 6aika/issue-reporting

Aarni Koskela

commit sha 6f1b9ebf2499406354b44be75ef8cc5733672332

Try to fix Travis build

push time in 3 days

push event 6aika/issue-reporting

Aarni Koskela

commit sha a34f3a78c7ef10dc93cdcd804065a5854852fe7e

Try to fix Travis build

push time in 3 days

push event 6aika/issue-reporting

Aarni Koskela

commit sha fd72c8c87db96d3df130ff18f5b0f3b86714f62a

Try to fix Travis build

push time in 3 days

push event 6aika/issue-reporting

Aarni Koskela

commit sha 7857c86d977b8a764e541e5b25851b59987ddf73

Try to fix Travis build

push time in 3 days

push event 6aika/issue-reporting

Aarni Koskela

commit sha bd5fc632482fa9401e53615d14cb61d05af3f023

Try to fix Travis build

push time in 3 days

push event 6aika/issue-reporting

Aarni Koskela

commit sha e122ebe8d2518dbfdaf897203690af25a0ae34f6

Try to fix Travis build

push time in 3 days

push event 6aika/issue-reporting

Aarni Koskela

commit sha aad404eca13b5eac870b464c47e1749140aea31c

Try to fix Travis build

push time in 3 days

push event 6aika/issue-reporting

Aarni Koskela

commit sha 27c83411ece5a3afb6118224db445d21ed392498

Try to fix Travis build

push time in 3 days

push event 6aika/issue-reporting

Aarni Koskela

commit sha 8e03b07fc2aca2773add5d604c929b213fce4492

Fix some custom database field issues

Aarni Koskela

commit sha 9941cfdb3fd5de804ad2acc996242e186383474d

Try to fix Travis build

push time in 3 days

started ktodisco/colorspace

started time in 4 days

push event 6aika/issue-reporting

Aarni Koskela

commit sha d290e5bdea2e60dc36cc5bafba308bff25c188e7

Modernize test setup some

Aarni Koskela

commit sha 9db62dd01795056c2630d037209721924120222e

Fix usage of is_authenticated

Aarni Koskela

commit sha f7767b28f9aedd0adf0ca98f9da42e792f2925ef

GeoJSON: return None for falsy values

Aarni Koskela

commit sha 272b14ac5b0d889bba2da0c1d14432d9149315f6

Note Python 3.6+ only compatibility

Aarni Koskela

commit sha 1d72031f341a9b84319f58a6963793abbde709d3

Run autoflake

Aarni Koskela

commit sha 0e1abbafc033b24d47cbc41e3628d23ec05ec3bf

Try to fix Travis build

Aarni Koskela

commit sha d3a3ebe306c9fc19eeb198141e7afd8867968025

Fix some custom database field issues

push time in 4 days

push event 6aika/issue-reporting

Aarni Koskela

commit sha 5e2a2b31b119e343b84d6cff30be4902b39f6019

Try to fix Travis build

push time in 4 days

push event 6aika/issue-reporting

Aarni Koskela

commit sha 45d8142cf2ea65a20b04e56129773e3fd1875923

Try to fix Travis build

push time in 4 days

push event 6aika/issue-reporting

Aarni Koskela

commit sha 32996a33a2f0ad727cec789bb796e40486163177

Modernize .travis.yml

push time in 4 days

PR opened 6aika/issue-reporting

Modern Django
  • Refresh requirements for Django 3.0+
  • Run Pyupgrade and Flynt
  • Fix all the things that have been or will be deprecated, or that didn't work

⚠️ After this PR is merged, the project requires Python 3.6 or newer. That said, older versions are end-of-life anyway...

+255 -286

0 comments

82 changed files

pr created time in 4 days

push event 6aika/issue-reporting

Aarni Koskela

commit sha 4f1e122543c5d17d2fa05eabf7e75b1aa9c5eb2a

Note Python 3.6+ only compatibility

Aarni Koskela

commit sha a79fd7cc6b4f202cb043e7134debbef86afeb004

Run autoflake

push time in 4 days

create branch 6aika/issue-reporting

branch: modern-django

created branch time in 4 days

created tag 6aika/issue-reporting

tag v0.1

Reference Open311 API server implementation

created time in 4 days

release 6aika/issue-reporting

v0.1

released time in 4 days

Pull request review comment valohai/valohai-yaml

Allow deployments in pipelines

       - [batch2.input.training-labels*, train.input.training-labels]
       - [batch2.metadata.optimal_learning_rate, train.parameter.learning_rate]
       - [batch2.output.processed-images*, train.input.training-images]
    +- pipeline:
    +    name: My deployment pipeline
    +    nodes:
    +      - name: batch1
    +        type: execution
    +        step: Batch feature extraction
    +      - name: train
    +        type: execution
    +        step: Train model
    +      - name: deploy
    +        type: deployment
    +        deployment:
    +          name: MyDeployment

Considering all the below requests, could a simpler shape for this be

      - name: deploy
        type: deployment
        deployment: MyDeployment
        endpoints:
          - predict-digit

or am I missing something?

magdapoppins

comment created time in 4 days

Pull request review comment valohai/valohai-yaml

Allow deployments in pipelines

       - [batch2.input.training-labels*, train.input.training-labels]
       - [batch2.metadata.optimal_learning_rate, train.parameter.learning_rate]
       - [batch2.output.processed-images*, train.input.training-images]
    +- pipeline:
    +    name: My deployment pipeline
    +    nodes:
    +      - name: batch1
    +        type: execution
    +        step: Batch feature extraction
    +      - name: train
    +        type: execution
    +        step: Train model
    +      - name: deploy
    +        type: deployment
    +        deployment:
    +          name: MyDeployment

I assume a deployment with this name needs to pre-exist. That's fine.

magdapoppins

comment created time in 4 days

Pull request review comment valohai/valohai-yaml

Allow deployments in pipelines

       - [batch2.input.training-labels*, train.input.training-labels]
       - [batch2.metadata.optimal_learning_rate, train.parameter.learning_rate]
       - [batch2.output.processed-images*, train.input.training-images]
    +- pipeline:
    +    name: My deployment pipeline
    +    nodes:
    +      - name: batch1
    +        type: execution
    +        step: Batch feature extraction
    +      - name: train
    +        type: execution
    +        step: Train model
    +      - name: deploy
    +        type: deployment
    +        deployment:
    +          name: MyDeployment
    +          version:
    +            name: 2000102.0

This can't be static, since attempting to create a version with the same name as a pre-existing version would fail, and there's no such concept as updating a deployment version's configuration; they're very much immutable.

If left empty, we could use the same logic we use for suggesting a new version name in the UI (i.e. generate a YYYYMMDD.X version).

I do foresee a need for changing that logic to allow for e.g. a prefix – should we allow some placeholders here? Something like foo-{datestamp}.{counter}? (@ruksi, ideas?)
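The fallback logic sketched above could look roughly like this (a hypothetical sketch; `suggest_version_name` and the `{datestamp}`/`{counter}` placeholder names are assumptions, not the actual implementation):

```python
import datetime

def suggest_version_name(existing, template="{datestamp}.{counter}", today=None):
    """Generate the first unused version name from a template (sketch of the YYYYMMDD.X logic)."""
    datestamp = (today or datetime.date.today()).strftime("%Y%m%d")
    counter = 0
    while True:
        name = template.format(datestamp=datestamp, counter=counter)
        if name not in existing:
            return name  # version names are immutable, so only unused names are valid
        counter += 1
```

With a prefix template such as `foo-{datestamp}.{counter}`, the same loop would produce `foo-20200920.0`, `foo-20200920.1`, and so on.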

magdapoppins

comment created time in 4 days

Pull request review comment valohai/valohai-yaml

Allow deployments in pipelines

       - [batch2.input.training-labels*, train.input.training-labels]
       - [batch2.metadata.optimal_learning_rate, train.parameter.learning_rate]
       - [batch2.output.processed-images*, train.input.training-images]
    +- pipeline:
    +    name: My deployment pipeline
    +    nodes:
    +      - name: batch1
    +        type: execution
    +        step: Batch feature extraction
    +      - name: train
    +        type: execution
    +        step: Train model
    +      - name: deploy
    +        type: deployment
    +        deployment:
    +          name: MyDeployment
    +          version:
    +            name: 2000102.0
    +            branch: any

Unnecessary.

magdapoppins

comment created time in 4 days

Pull request review comment valohai/valohai-yaml

Allow deployments in pipelines

       - [batch2.input.training-labels*, train.input.training-labels]
       - [batch2.metadata.optimal_learning_rate, train.parameter.learning_rate]
       - [batch2.output.processed-images*, train.input.training-images]
    +- pipeline:
    +    name: My deployment pipeline
    +    nodes:
    +      - name: batch1
    +        type: execution
    +        step: Batch feature extraction
    +      - name: train
    +        type: execution
    +        step: Train model
    +      - name: deploy
    +        type: deployment
    +        deployment:
    +          name: MyDeployment
    +          version:
    +            name: 2000102.0
    +            branch: any
    +            commit: none

Why is this required here for deployment nodes if it's not required for execution nodes? (Execution nodes use the commit the blueprint originates from by default; the same should occur here.)

magdapoppins

comment created time in 4 days

Pull request review comment valohai/valohai-yaml

Allow deployments in pipelines

       - [batch2.input.training-labels*, train.input.training-labels]
       - [batch2.metadata.optimal_learning_rate, train.parameter.learning_rate]
       - [batch2.output.processed-images*, train.input.training-images]
    +- pipeline:
    +    name: My deployment pipeline
    +    nodes:
    +      - name: batch1
    +        type: execution
    +        step: Batch feature extraction
    +      - name: train
    +        type: execution
    +        step: Train model
    +      - name: deploy
    +        type: deployment
    +        deployment:
    +          name: MyDeployment
    +          version:
    +            name: 2000102.0
    +            branch: any
    +            commit: none
    +            endpoints:
    +              - name: predict-digit
    +                files:
    +                  - name: model

... and if it refers to an existing endpoint, that endpoint specifies these, not the pipeline step.

magdapoppins

comment created time in 4 days

Pull request review comment valohai/valohai-yaml

Allow deployments in pipelines

       - [batch2.input.training-labels*, train.input.training-labels]
       - [batch2.metadata.optimal_learning_rate, train.parameter.learning_rate]
       - [batch2.output.processed-images*, train.input.training-images]
    +- pipeline:
    +    name: My deployment pipeline
    +    nodes:
    +      - name: batch1
    +        type: execution
    +        step: Batch feature extraction
    +      - name: train
    +        type: execution
    +        step: Train model
    +      - name: deploy
    +        type: deployment
    +        deployment:
    +          name: MyDeployment
    +          version:
    +            name: 2000102.0
    +            branch: any
    +            commit: none
    +            endpoints:
    +              - name: predict-digit

This must refer to an existing endpoint: in the YAML file. This YAML file doesn't seem to be declaring any endpoints?

magdapoppins

comment created time in 4 days

Pull request review comment valohai/valohai-yaml

Allow deployments in pipelines

       - [batch2.input.training-labels*, train.input.training-labels]
       - [batch2.metadata.optimal_learning_rate, train.parameter.learning_rate]
       - [batch2.output.processed-images*, train.input.training-images]
    +- pipeline:
    +    name: My deployment pipeline
    +    nodes:
    +      - name: batch1
    +        type: execution
    +        step: Batch feature extraction
    +      - name: train
    +        type: execution
    +        step: Train model
    +      - name: deploy
    +        type: deployment
    +        deployment:
    +          name: MyDeployment
    +          version:
    +            name: 2000102.0
    +            branch: any
    +            commit: none
    +            endpoints:
    +              - name: predict-digit
    +                files:
    +                  - name: model
    +    edges:
    +      #  node.entity_type.entity_name    node.entity_type.entity_name
    +      - [batch1.input.training-labels*, train.input.training-labels]
    +      - [batch1.metadata.optimal_learning_rate, train.parameter.learning_rate]
    +      - [batch1.output.processed-images*, train.input.training-images]
    +      - [train.output.model, deploy.file.model]

If there are multiple endpoints...

      - [train.output.model, deploy.file.predict-digit.model]
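Parsing the endpoint-qualified form could work along these lines (a hypothetical sketch; `parse_edge_endpoint` is not part of the library):

```python
def parse_edge_endpoint(spec):
    """Split 'node.entity_type.entity_name' into its parts.

    The entity name may itself contain dots, as in
    'deploy.file.predict-digit.model' (endpoint-qualified file key),
    so we split on at most two dots.
    """
    node, entity_type, entity_name = spec.split(".", 2)
    return node, entity_type, entity_name
```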
magdapoppins

comment created time in 4 days


started EpistasisLab/tpot

started time in 5 days

issue closed python-babel/babel

Currencies dict is returning currencies display text in lowercase for Locale pt-PT

I'm using Babel==2.8.0. It was working fine on Babel==2.3.4.

Example:

Working fine in de-DE:

    from babel.core import default_locale, Locale
    l = Locale('de', 'DE')

    l.currencies['EUR']
    Out: 'Euro'

    l.currencies['USD']
    Out: 'US-Dollar'

    l.currencies['GBP']
    Out: 'Britisches Pfund'

    l.currencies['MZN']
    Out: 'Mosambikanischer Metical'

If the locale is set to pt-PT, it returns the currency text in lowercase:

    from babel.core import default_locale, Locale
    l = Locale('pt', 'PT')

    l.currencies['EUR']
    Out: 'euro'

    l.currencies['USD']
    Out: 'dólar dos Estados Unidos'

    l.currencies['GBP']
    Out: 'libra esterlina britânica'

    l.currencies['MZN']
    Out: 'metical moçambicano'

closed time in 6 days

peca11

issue comment python-babel/babel

Currencies dict is returning currencies display text in lowercase for Locale pt-PT

This is as expected: in many languages, apparently including Portuguese, currency names are spelled in lowercase. (You can see that the 'USD' name does have "Estados Unidos", "United States", capitalized.)

You'll find similar behavior for e.g. my native Finnish (fi-FI).

Please feel free to reopen if you have further questions.
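To make the point concrete, here is a hypothetical excerpt of the underlying display-name data (the 'de' and 'pt' values are taken from the examples above; real code would read `Locale(...).currencies`, which Babel fills from CLDR):

```python
# Hypothetical excerpt of CLDR currency display names; the real data ships
# with Babel and is exposed via Locale(...).currencies.
CURRENCY_NAMES = {
    ("de", "EUR"): "Euro",
    ("de", "USD"): "US-Dollar",
    ("pt", "EUR"): "euro",
    ("pt", "USD"): "dólar dos Estados Unidos",
}

def display_name(locale, code):
    """Look up a display name; capitalization follows each locale's own spelling rules."""
    return CURRENCY_NAMES[(locale, code)]
```

The lowercase 'euro' for pt is thus data, not a bug: the library simply reflects each locale's orthography.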

peca11

comment created time in 6 days

push event akx/d2d

Aarni Koskela

commit sha 4814c29dc525c03e7af1349fe10f66614a9bca73

Add Billtag sample

push time in 6 days

push event akx/listeningclub

Aarni Koskela

commit sha c26b53e7135314c3c377b2a09cefe233a6b7cf6c

Update

push time in 7 days

started threemachines/obliqueMOTD

started time in 7 days

push event akx/pipimi

Aarni Koskela

commit sha 90ceed19ca6b6976b19a57a6e76b75d46d672ce9

Emfasten

Aarni Koskela

commit sha 819ee1e3a278568408bfe2ae2d248de887efafae

Add --show-constraints

push time in 10 days

push event akx/pipimi

Aarni Koskela

commit sha 6a3b451abbc384075e02f6956d36312284fdbeb7

Fix a thing or two

push time in 10 days

create branch akx/pipimi

branch: master

created branch time in 10 days

created repository akx/pipimi

created time in 10 days

create branch akx/so63831434

branch: master

created branch time in 10 days

created repository akx/so63831434

created time in 10 days

push event akx/kuntaliitto-scrape

Aarni Koskela

commit sha cf19dff96aaa63f3594f09118571c46cf5b404d8

Initial commit

push time in 10 days

create branch akx/kuntaliitto-scrape

branch: master

created branch time in 10 days

created repository akx/kuntaliitto-scrape

created time in 10 days

PR opened valohai/valohai-yaml

Add linting for parameter defaults

Refs valohai/valohai-cli#128

+84 -4

0 comments

5 changed files

pr created time in 12 days

create branch valohai/valohai-yaml

branch: validate-default

created branch time in 12 days

issue comment hzdg/django-enumfields

faced this while trying to upgrade a django 1.8 project to django 2.2/3.0

    from django_enumfield import enum

I only just noticed you're not even using this project. This project's module is called enumfields, not django_enumfield.

auvipy

comment created time in 12 days

issue comment hzdg/django-enumfields

faced this while trying to upgrade a django 1.8 project to django 2.2/3.0

class Labels, not class labels or class levels.

auvipy

comment created time in 13 days

issue comment valohai/valohai-cli

Some default string values in valohai.yaml pass the linting, but fail as internal error

There are two options here: either actually lint default values against the declared type and fail hard when the type is incorrect, or quietly coerce values such as dates into strings...
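The first option could be sketched like this (a hypothetical helper; the real linter works on the parsed valohai.yaml structures, and the type names here are assumptions):

```python
def default_matches_type(declared_type, default):
    """Check a parameter default against its declared type (sketch of the 'fail hard' option)."""
    expected = {"integer": int, "float": (int, float), "string": str, "flag": bool}[declared_type]
    # bool is a subclass of int in Python, so reject it explicitly for numeric types
    if isinstance(default, bool) and declared_type != "flag":
        return False
    return isinstance(default, expected)
```

A YAML date parsed into a non-string object would then fail the `string` check at lint time instead of surfacing as an internal error later.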

JuhaKiili

comment created time in 14 days

push event akx/react-wheely

Aarni Koskela

commit sha 491f3046ce35ab04cbfef143c5c537ea6f75541b

Update metadata bits

Aarni Koskela

commit sha 541e6c09e56de989539ad87e19110431019338ce

Update dependencies (incl. Storybook 6)

push time in 14 days

issue comment hzdg/django-enumfields

faced this while trying to upgrade a django 1.8 project to django 2.2/3.0

The correct syntax for enumfields.Enum labels is

    class Labels:
        RED = 'A custom label'

instead of

    labels = {
        RED: 'A custom label'
    }

as you seem to be using.
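enumfields wires the Labels class up with its own metaclass; a rough stdlib approximation of the lookup (the `label_of` helper and the standalone `Labels` class are illustrative, not the library's API):

```python
import enum

class Labels:
    RED = 'A custom label'

class Color(enum.Enum):
    RED = 'r'
    GREEN = 'g'

def label_of(member, labels=Labels):
    """Return the custom label declared for the member, or a title-cased fallback."""
    return getattr(labels, member.name, member.name.title())
```

The key point is that labels are looked up by the member's *name* (RED), which is why a dict keyed on values doesn't work.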

auvipy

comment created time in 14 days

push event valohai/valohai-cli

magda

commit sha 9341e4803e62f12d22893e57bdbbd11d9655f407

Pipeline runs from cli
Fixes #125

magda

commit sha 8ca43e3cbd4bd7c657d8f5e157208f13637da75e

Clean up imports of click.BadParameter

Magda Stenius

commit sha 6658770b193400346fa28b4942088da4591bae88

Merge branch 'master' into pipelines

Aarni Koskela

commit sha 988c7a2b415ad244379ce8af4e4ca0ff74ea5b36

Merge pull request #126 from valohai/pipelines
Pipeline runs

push time in 14 days

delete branch valohai/valohai-cli

delete branch: pipelines

delete time in 14 days

PR merged valohai/valohai-cli

Pipeline runs
  • Add the possibility to kick off a pipeline run by its name, based entirely on the valohai.yaml configuration's default values

Fixes #125

+430 -7

1 comment

13 changed files

magdapoppins

pr closed time in 14 days

issue closed valohai/valohai-cli

Allow running pipelines

  • Make it possible to run a pipeline from a blueprint in the YAML file

closed time in 14 days

akx

pull request comment percipient/django-allauth-2fa

Simplify TwoFactorAuthenticate (less copy-paste)

@clokep Rebased once again. @greyhare As the author of #103, you might be interested in this approach too.

akx

comment created time in 17 days

push event valohai/django-allauth-2fa

Illia Volochii

commit sha 958dbc3a959b9b35d5c0a52b341ff8bdd918ed5a

Stop restricting a class of an adapter in `TwoFactorAuthenticate`

Patrick Cloke

commit sha 3dbff6acc6dc56b476489358bf31932dada003cc

Merge pull request #96 from illia-v/patch-1

Erwin Junge

commit sha 29ec6e84d463fb4fda751e0b5e5bd64bac0c4761

Use same base template as upstream allauth

Patrick Cloke

commit sha f1160d84d95f7dfef5edbf71e50976f631eb1302

Merge pull request #98 from ErwinJunge/allauth-base-template
Use same base template as upstream allauth

Aarni Koskela

commit sha a99eb18b56de2a5be7eeeaa80892b37e0362ceb3

Simplify TwoFactorAuthenticate by having the adapter check for the OTP device session field

push time in 17 days

issue comment percipient/django-allauth-2fa

Documentation error which can cause TFA bypass

django-otp (used by this library) also sets a session key to denote that a session has been OTP-verified. That should also be checked by the middleware/decorators...

https://github.com/django-otp/django-otp/blob/cd8429869ff580d3ae70da0cba3f615d95fcc6a6/src/django_otp/__init__.py#L25-L27
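A minimal sketch of the extra check (assuming django-otp's default session key name 'otp_device_id'; the helper name is illustrative, not part of either library):

```python
# Assumption: django-otp stores the verified device's persistent id under this
# session key (its DEVICE_ID_SESSION_KEY default).
OTP_DEVICE_ID_SESSION_KEY = "otp_device_id"

def session_is_otp_verified(session):
    """True if django-otp has recorded a verified device id in this session (sketch)."""
    return session.get(OTP_DEVICE_ID_SESSION_KEY) is not None

# A middleware/decorator should require BOTH a logged-in user and this flag;
# checking only the login state is what enables the bypass described above.
```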

pjensen000

comment created time in 17 days

Pull request review comment valohai/valohai-cli

Pipeline runs

    +import click
    +from click import BadParameter
    +
    +from valohai_cli.utils import match_prefix
    +
    +
    +def build_nodes(commit, config, pipeline):
    +    nodes = []
    +    for node in pipeline.nodes:
    +        step = config.steps.get(node.step)
    +        template = build_node_template(commit, step)
    +        nodes.append({
    +            "name": node.name,
    +            "type": node.type,
    +            "template": template,
    +        })
    +    return nodes
    +
    +
    +def build_node_template(commit, step):
    +    template = {
    +        "commit": commit,
    +        "step": step.name,
    +        "image": step.image,
    +        "command": step.command,
    +        "inputs": {
    +            key: step.inputs[key].default for key in list(step.inputs)
    +        },
    +        "parameters": {
    +            key: step.parameters[key].default for key in
    +            list(step.parameters)
    +        },
    +        "inherit_environment_variables": True,
    +        "environment_variables": {
    +            envvar.key(): envvar.value() for envvar in step.environment_variables
    +        }
    +    }
    +    if step.environment:
    +        template["environment"] = step.environment
    +    return template
    +
    +
    +def build_edges(pipeline):
    +    return list({
    +        "source_node": edge.source_node,
    +        "source_key": edge.source_key,
    +        "source_type": edge.source_type,
    +        "target_node": edge.target_node,
    +        "target_type": edge.target_type,
    +        "target_key": edge.target_key,
    +    } for edge in pipeline.edges)
    +
    +
    +def match_pipeline(config, pipeline_name):
    +    """
    +    Take a pipeline name and try and match it to the configs pipelines.
    +    Returns the match if there is only one option.
    +    """
    +    if pipeline_name in config.pipelines:
    +        return pipeline_name
    +    matching_pipelines = match_prefix(config.pipelines, pipeline_name, return_unique=False)
    +    if not matching_pipelines:
    +        raise BadParameter(
    +            '"{pipeline}" is not a known pipeline (try one of {pipelines})'.format(
    +                pipeline=pipeline_name,
    +                pipelines=', '.join(click.style(t, bold=True) for t in sorted(config.pipelines))
    +            ), param_hint='pipeline')
    +    if len(matching_pipelines) > 1:
    +        raise BadParameter(

Sure. I think a single import click is better than import click; from click import ... :)
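The prefix matching that match_pipeline relies on can be sketched in plain Python (a hypothetical stand-in for valohai_cli.utils.match_prefix, with assumed semantics for return_unique):

```python
def match_prefix(choices, value, return_unique=True):
    """Return the choices that start with value.

    Hypothetical stand-in for valohai_cli.utils.match_prefix: with
    return_unique=True, return the single match (or None if there is
    no unambiguous match); otherwise return the full list of matches.
    """
    matches = [c for c in choices if c.startswith(value)]
    if return_unique:
        return matches[0] if len(matches) == 1 else None
    return matches
```

Passing return_unique=False, as match_pipeline does, lets the caller distinguish "no match" from "ambiguous match" and raise a tailored BadParameter for each.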

magdapoppins

comment created time in 18 days


Pull request review comment valohai/valohai-cli

Pipeline runs

    +from pprint import pprint
    +

Seems unused?

magdapoppins

comment created time in 18 days

Pull request review comment valohai/valohai-cli

Pipeline runs

    +import click
    +from click import BadParameter

magdapoppins

comment created time in 18 days

Pull request review comment valohai/valohai-cli

Pipeline runs

         },
     }

    +PIPELINE_DATA = {
    +    'counter': 21,
    +    'creator': {'id': 1, 'username': 'magda'},
    +    'ctime': '2020-09-02T09:14:41.216963Z',
    +    'deleted': False,
    +    'edges': [{'id': '01744e18-c8cf-4ed9-0a44-412f65b0f224',
    +               'pipeline': '01744e18-c8bf-e6cd-d329-5de7ed61bb52',
    +               'source_key': '*train-images*',
    +               'source_node': '01744e18-c8c1-a267-44a2-5e7f0a86bf38',
    +               'source_type': 'output',
    +               'target_key': 'training-set-images',
    +               'target_node': '01744e18-c8c7-a1d4-d0be-0cb3cbfa7ee0',
    +               'target_type': 'input'},
    +              {'id': '01744e18-c8e3-492d-135e-a1ef237c4a94',
    +               'pipeline': '01744e18-c8bf-e6cd-d329-5de7ed61bb52',
    +               'source_key': '*train-labels*',
    +               'source_node': '01744e18-c8c1-a267-44a2-5e7f0a86bf38',
    +               'source_type': 'output',
    +               'target_key': 'training-set-labels',
    +               'target_node': '01744e18-c8c7-a1d4-d0be-0cb3cbfa7ee0',
    +               'target_type': 'input'},
    +              {'id': '01744e18-c8e6-e27d-442b-b820c8975a37',
    +               'pipeline': '01744e18-c8bf-e6cd-d329-5de7ed61bb52',
    +               'source_key': '*test-images*',
    +               'source_node': '01744e18-c8c1-a267-44a2-5e7f0a86bf38',
    +               'source_type': 'output',
    +               'target_key': 'test-set-images',
    +               'target_node': '01744e18-c8c7-a1d4-d0be-0cb3cbfa7ee0',
    +               'target_type': 'input'},
    +              {'id': '01744e18-c8e9-924a-7f89-d4033b0bd1a4',
    +               'pipeline': '01744e18-c8bf-e6cd-d329-5de7ed61bb52',
    +               'source_key': '*test-labels*',
    +               'source_node': '01744e18-c8c1-a267-44a2-5e7f0a86bf38',
    +               'source_type': 'output',
    +               'target_key': 'test-set-labels',
    +               'target_node': '01744e18-c8c7-a1d4-d0be-0cb3cbfa7ee0',
    +               'target_type': 'input'},
    +              {'id': '01744e18-c8ec-d09c-4e5e-c749fbbf1e80',
    +               'pipeline': '01744e18-c8bf-e6cd-d329-5de7ed61bb52',
    +               'source_key': 'model*',
    +               'source_node': '01744e18-c8c7-a1d4-d0be-0cb3cbfa7ee0',
    +               'source_type': 'output',
    +               'target_key': 'model',
    +               'target_node': '01744e18-c8c9-167b-c2b6-a9f06d0c1735',
    +               'target_type': 'input'}],
    +    'id': '01744e18-c8bf-e6cd-d329-5de7ed61bb52',
    +    'log': [{'ctime': '2020-09-02T09:14:41.375499Z',
    +             'identifier': 'node_started',
    +             'kind': 'other',
    +             'message': 'Node "preprocess" started'},
    +            {'ctime': '2020-09-02T09:14:41.271304Z',
    +             'identifier': 'started',
    +             'kind': 'other',
    +             'message': 'Pipeline started'}],
    +    'nodes': [{'execution': None,
    +               'id': '01744e18-c8c7-a1d4-d0be-0cb3cbfa7ee0',
    +               'log': [],
    +               'name': 'train',
    +               'pipeline': '01744e18-c8bf-e6cd-d329-5de7ed61bb52',
    +               'status': 'created',
    +               'type': 'execution'},
    +              {'execution': None,
    +               'id': '01744e18-c8c9-167b-c2b6-a9f06d0c1735',
    +               'log': [],
    +               'name': 'evaluate',
    +               'pipeline': '01744e18-c8bf-e6cd-d329-5de7ed61bb52',
    +               'status': 'created',
    +               'type': 'execution'},
    +              {'execution': {'counter': 24,
    +                             'creator_name': 'magda',
    +                             'ctime': '2020-09-02T09:14:41.357081Z',
    +                             'cumulative_metadata': None,
    +                             'deleted': False,
    +                             'duration': None,
    +                             'end_time': None,
    +                             'id': '01744e18-c946-53e7-5780-03267649539d',
    +                             'n_alerts': 0,
    +                             'n_comments': 0,
    +                             'n_outputs': 0,
    +                             'pipeline': '01744e18-c8bf-e6cd-d329-5de7ed61bb52',
    +                             'project': '017433ed-d913-7f38-610d-6121caf6d31c',
    +                             'status': 'queued',
    +                             'step': 'Preprocess dataset (MNIST)',
    +                             'tags': [],
    +                             'task': None,
    +                             'task_counter': None,
    +                             'title': '',
    +                             'url': 'http://127.0.0.1:8000/api/v0/executions/01744e18-c946-53e7-5780-03267649539d/',
    +                             'urls': {
    +                                 'copy': 'http://127.0.0.1:8000/p/magda/tensorflow-example/executions/create/?from=01744e18-c946-53e7-5780-03267649539d',
    +                                 'copy_to_task': 'http://127.0.0.1:8000/p/magda/tensorflow-example/tasks/create/?from=01744e18-c946-53e7-5780-03267649539d',
    +                                 'display': 'http://127.0.0.1:8000/p/magda/tensorflow-example/execution/01744e18-c946-53e7-5780-03267649539d/',
    +                                 'stop': 'http://127.0.0.1:8000/api/v0/executions/01744e18-c946-53e7-5780-03267649539d/stop/'}},
    +               'id': '01744e18-c8c1-a267-44a2-5e7f0a86bf38',
    +               'log': [{'ctime': '2020-09-02T09:14:41.372952Z',
    +                        'identifier': '',
    +                        'kind': 'other',
    +                        'message': 'Created execution tensorflow-example/#24'}],
    +               'name': 'preprocess',
    +               'pipeline': '01744e18-c8bf-e6cd-d329-5de7ed61bb52',
    +               'status': 'started',
    +               'type': 'execution'}],
    +    'project': {'ctime': '2020-08-28T07:17:39.731967Z',
    +                'description': '',
    +                'enabled_endpoint_count': None,
    +                'execution_count': None,
    +                'id': '017433ed-d913-7f38-610d-6121caf6d31c',
    +                'last_execution_ctime': None,
    +                'mtime': '2020-08-28T07:17:39.732000Z',
    +                'name': 'tensorflow-example',
    +                'owner': {'id': 1, 'username': 'magda'},
    +                'queued_execution_count': None,
    +                'running_execution_count': None,
    +                'url': 'http://127.0.0.1:8000/api/v0/projects/017433ed-d913-7f38-610d-6121caf6d31c/',
    +                'urls': {'display': 'http://127.0.0.1:8000/p/magda/tensorflow-example/',
    +                         'display_repository': 'http://127.0.0.1:8000/p/magda/tensorflow-example/settings/repository/'}},
    'project': PROJECT_DATA,
magdapoppins

comment created time in 18 days

Pull request review comment valohai/valohai-cli

Pipeline runs

     }, }

[large fused diff hunk elided: a PIPELINE_DATA fixture added to the test fixture data — counter 21, creator magda, five edges wiring preprocess outputs into the train/evaluate node inputs, a log, and three nodes, the third of which inlines a full execution payload for execution #24 of magda/tensorflow-example]
              {'execution': EXECUTION_DATA,
magdapoppins

comment created time in 18 days

PullRequestReviewEvent

push event suomipelit/suomipelit.github.io

Aarni Koskela

commit sha 046f584edc21c7ae8c7564a67d4222731c8928d3

Fix layout on mobile, once again

view details

push time in 19 days

create branch valohai/jupyhai_intellij_plugin

branch : old

created branch time in 19 days

push event valohai/jupyhai_intellij_plugin

Drazen Dodik

commit sha d695d7cba82f06a63e8d4b2ee51ecbb043d05b27

Initial commit

view details

push time in 19 days

push event suomipelit/suomipelit.github.io

Aarni Koskela

commit sha c725fd4b17dd19279979141d39a2020d1097c815

Slightly emslimmen things

view details

push time in 20 days

issue comment Azure/msrestazure-for-python

Maximum retry-after time

Any progress or ideas on this? Just bumped into this again, with a script stalling without any log output since the time.sleep() it's entering isn't quite logged anywhere.
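A sketch of the kind of wrapper that would help here — illustrative only, not msrestazure API (the helper name and cap are made up), just to show that logging before honoring a long Retry-After makes the stall visible:

```python
import logging
import time

log = logging.getLogger("retry")

def sleep_with_log(seconds: float, cap: float = 300.0) -> float:
    """Sleep for min(seconds, cap), logging the wait so long stalls show up in output."""
    wait = min(seconds, cap)
    log.warning("Retry-After of %.3fs requested; sleeping %.3fs", seconds, wait)
    time.sleep(wait)
    return wait
```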

akx

comment created time in 20 days

Pull request review comment valohai/valohai-cli

Pipeline runs

+import click
+
+from valohai_cli.api import request
+from valohai_cli.commands.pipeline.run.utils import build_edges, build_nodes, match_pipeline
+from valohai_cli.ctx import get_project
+from valohai_cli.messages import success, error
+
+
+@click.command(
+    context_settings=dict(ignore_unknown_options=True),
+    add_help_option=False
+)
+@click.argument('name', required=False, metavar='PIPELINE-NAME')
+@click.option('--commit', '-c', default=None, metavar='SHA', help='The commit to use. Defaults to the current HEAD.')
+@click.option('--title', '-c', default=None, metavar='SHA', help='The optional title of the pipeline run.')

Metavar is not SHA here; leave it out :)

@click.option('--title', '-c', default=None, help='The optional title of the pipeline run.')
magdapoppins

comment created time in 20 days

Pull request review comment valohai/valohai-cli

Pipeline runs

+import click
+
+from valohai_cli.api import request
+from valohai_cli.commands.pipeline.run.utils import build_edges, build_nodes, match_pipeline
+from valohai_cli.ctx import get_project
+from valohai_cli.messages import success, error
+
+
+@click.command(
+    context_settings=dict(ignore_unknown_options=True),
+    add_help_option=False
+)
+@click.argument('name', required=False, metavar='PIPELINE-NAME')
+@click.option('--commit', '-c', default=None, metavar='SHA', help='The commit to use. Defaults to the current HEAD.')
+@click.option('--title', '-c', default=None, metavar='SHA', help='The optional title of the pipeline run.')
+@click.pass_context
+def run(ctx, name, commit, title):
+    """
+    Start a pipeline run.
+    """
+    # Having to explicitly compare to `--help` is slightly weird, but it's because of the nested command thing.
+    if name == '--help' or not name:
+        click.echo(ctx.get_help(), color=ctx.color)
+        try:
+            config = get_project(require=True).get_config(commit_identifier=commit)
+            if config.pipelines:
+                click.secho('\nThese pipelines are available in the selected commit:\n', color=ctx.color, bold=True)
+                for pipeline in sorted(config.pipelines):
+                    click.echo('   * %s' % pipeline, color=ctx.color)
+        except:  # If we fail to extract the pipeline list, it's not that big of a deal.
+            pass
+        ctx.exit()
+
+    project = get_project(require=True)
+    commit = commit or project.resolve_commit()['identifier']
+    config = project.get_config()
+
+    matched_pipeline = match_pipeline(config, name)
+    pipeline = config.pipelines[matched_pipeline]
+
+    start_pipeline(config, pipeline, project.id, commit, title)
+
+
+def start_pipeline(config, pipeline, project_id, commit, title=None):
+    edges = build_edges(pipeline)
+    nodes = build_nodes(commit, config, pipeline)
+    payload = {
+        "edges": edges,
+        "nodes": nodes,
+        "project": project_id,
+        "title": title if title else pipeline.name,
        "title": title or pipeline.name,
magdapoppins

comment created time in 20 days

Pull request review comment valohai/valohai-cli

Pipeline runs

     def handle_create_execution(self, request, context):
         context.status_code = 201
         return EXECUTION_DATA.copy()

+    def handle_create_pipeline(self, request, context):
+        body_json = json.loads(request.body.decode('utf-8'))
+        assert body_json['project'] == self.project_id
+        assert len(body_json['edges']) == 5
+        assert len(body_json['nodes']) == 3
+        context.status_code = 201
+        return EXECUTION_DATA.copy()

The pipeline create response probably isn't an execution create response...

magdapoppins

comment created time in 20 days

Pull request review comment valohai/valohai-cli

Pipeline runs

+from click import BadParameter
+from pytest import raises
+
+from tests.commands.execution.run_test_utils import RunAPIMock
+from tests.fixture_data import PROJECT_DATA
+from valohai_cli.commands.pipeline.run import run
+from valohai_cli.commands.pipeline.run.utils import match_pipeline
+from valohai_cli.ctx import get_project
+
+
+def test_pipeline_run_success(runner, logged_in_and_linked):
+    add_valid_pipeline_yaml()
+    args = ['training']
+    with RunAPIMock(PROJECT_DATA.get('id')) as m:
+        output = runner.invoke(run, args).output
+    assert 'Success' in output
+
+
+def test_pipeline_run_no_name(runner, logged_in_and_linked):
+    add_valid_pipeline_yaml()
+    args = ['']
+    with RunAPIMock(PROJECT_DATA.get('id')) as m:
+        output = runner.invoke(run, args).output
+    assert 'Usage: ' in output
+
+
+def test_match_pipeline(runner, logged_in_and_linked):
+    add_valid_pipeline_yaml()
+    config = get_project().get_config()
+    matches = match_pipeline(config, 'Training')
+    assert matches == "Training Pipeline"
+
+
+def test_match_pipeline_ambiguous(runner, logged_in_and_linked):
+    add_valid_pipeline_yaml()
+    config = get_project().get_config()
+    with raises(BadParameter):
+        match_pipeline(config, 'Train')
+
+
+def add_valid_pipeline_yaml():
+    with open(get_project().get_config_filename(), 'w') as yaml_fp:
+        yaml_fp.write('''

I think this data could live in tests/fixture_data.py?

magdapoppins

comment created time in 20 days

PullRequestReviewEvent
PullRequestReviewEvent

push event akx/moo

Aarni Koskela

commit sha a8fcacc9565505fc68a3f554abba0da0ddbd7f05

Upgrade docs to 1.35

view details

push time in 20 days

create branch akx/moo

branch : master

created branch time in 20 days

created repository akx/moo

created time in 20 days

issue comment silvia-odwyer/photon

"Performance" section on web site

The benchmarks page at https://github.com/silvia-odwyer/photon/wiki/Benchmarks still doesn't show any methodology for the Python Imaging Library results 😞

akx

comment created time in 20 days

PR opened googleapis/google-auth-library-python

fix: remove checks for ancient versions of Cryptography

Refs https://github.com/googleapis/google-auth-library-python/issues/595#issuecomment-683903062

I see no point in checking whether someone is running a version of https://github.com/pyca/cryptography/ from 2014 that doesn't even compile against modern versions of OpenSSL anymore.

+0 -25

0 comment

2 changed files

pr created time in 21 days

create branch akx/google-auth-library-python

branch : cheap-cryptography-check

created branch time in 21 days

issue comment googleapis/google-auth-library-python

Setuptools as dependency is problematic w/ pip-tools

Hey @busunkim96, thanks for the response. :)

Considering the issue in #322 was only a warning, and even so only manifested when using https://github.com/pantsbuild/pex (which, as far as I know and have used it, is used for application packaging, not library packaging), I think (well, with hindsight being 20:20 and all) the real fix would be for the pex-built project to require a newer setuptools, not this library.

That said, though, I wonder if pkg_resources (from setuptools) is required at all anymore here:

~/b/google-auth-library-python (master) $ git grep pkg_res
docs/conf.py:import pkg_resources
docs/conf.py:version = pkg_resources.get_distribution("google-auth").version
google/__init__.py:    import pkg_resources
google/__init__.py:    pkg_resources.declare_namespace(__name__)
google/auth/crypt/_cryptography_rsa.py:import pkg_resources
google/auth/crypt/_cryptography_rsa.py:    release = pkg_resources.get_distribution("cryptography").parsed_version
google/auth/crypt/_cryptography_rsa.py:    if release < pkg_resources.parse_version("1.4.0"):
google/auth/crypt/_cryptography_rsa.py:except pkg_resources.DistributionNotFound:  # pragma: NO COVER
google/auth/crypt/es256.py:import pkg_resources
google/auth/crypt/es256.py:    release = pkg_resources.get_distribution("cryptography").parsed_version
google/auth/crypt/es256.py:    if release < pkg_resources.parse_version("1.4.0"):
google/auth/crypt/es256.py:except pkg_resources.DistributionNotFound:  # pragma: NO COVER

It seems to be used for two things in library code:

  • declaring google as a namespace package; considering the hitch-hiker's guide to packaging prefers native namespace packages that have been a thing since 3.3 and this package doesn't support versions <3.3 anyway (https://github.com/googleapis/google-auth-library-python/blob/6269643ee675f02c39795e163a12da6b27d991d2/setup.py#L49), that could go away.
  • figuring out whether the cryptography package is >= 1.4. Cryptography has exposed cryptography.__version__ for at least 7 years so I think just checking that would be enough. Or, you know, just require cryptography >= 1.4 – it's been out since 2014.
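To illustrate that second point, a minimal sketch of what a direct check could look like — the helper name is made up, and in practice just pinning cryptography >= 1.4 in setup.py would be simpler still:

```python
def version_at_least(version: str, minimum: tuple) -> bool:
    """Compare the leading numeric components of a dotted version string
    (e.g. cryptography.__version__) against a (major, minor, ...) tuple."""
    parts = []
    for piece in version.split("."):
        if not piece.isdigit():  # stop at suffixes like '1.4.dev1'
            break
        parts.append(int(piece))
    return tuple(parts) >= minimum

# e.g. instead of pkg_resources.get_distribution("cryptography").parsed_version:
# import cryptography
# has_modern_cryptography = version_at_least(cryptography.__version__, (1, 4))
```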
akx

comment created time in 21 days

created tag valohai/minique

tag v0.2.0

A minimal Redis job queue for Python 3.

created time in 21 days
