Chris Petersen ex-nerd Seattle http://ex-nerd.com This statement is false.

ex-nerd/gmtasks 27

Multi-process Gearman Task Server library

cookbrite/Recipe-to-Markdown 14

Scrape recipes from AllRecipes.com to text/Markdown. Use at your own risk.

ex-nerd/cpool 6

Simple connection pool class for Python

Beirdo/beirdobot 4

Beirdo's C IRC bot

cookbrite/cwac-camera 3

This public fork of CWAC-Camera library concentrates on providing camera preview and photo review screens that are locked to portrait orientation. Our use of the library is limited to certain code paths, so use at your own risk! If in doubt, you should use the original repository published by commonsguy.

ex-nerd/audiotools 3

Miscellaneous tools for managing/cleaning audio files (mp4, m4a, m4b, flac, etc)

cookbrite/spoon 1

Distributing instrumentation tests to all your Androids.

ex-nerd/brainwave-arduino 1

Arduino hardware bundle for Brainwave board

ex-nerd/fake-gcs-action 1

GitHub Action for running fake-gcs-server in a detached container in the background.

cookbrite/ebs-deploy 0

Python based command line tools for managing Amazon Elastic Beanstalk applications.

issue commentBackMarket/github-mermaid-extension

Update browser extensions to latest Mermaid

Seems this is somewhat a dupe of #21

brunowowk

comment created time in a month

pull request commentBackMarket/github-mermaid-extension

Compatibility with mermaid 8.4.4

Mermaid is now up to v8.6 and development seems to be progressing fairly quickly.

Maybe there is a way to auto-update the js file so this extension doesn't need to be updated every time mermaid puts out a new version?
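One way to sketch that idea (the repo layout, file path, CDN URL, and action versions here are assumptions, not this extension's actual setup) is a scheduled workflow that refreshes the vendored bundle and opens a PR when it changes:

```yaml
# Hypothetical sketch only: paths and the unpkg URL are assumptions.
name: update-mermaid
on:
  schedule:
    - cron: "0 4 * * 1"  # weekly, Monday 04:00 UTC
jobs:
  update:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      # unpkg serves the latest published build of a package by default
      - run: curl -fsSL "https://unpkg.com/mermaid/dist/mermaid.min.js" -o src/mermaid.min.js
      - uses: peter-evans/create-pull-request@v3
        with:
          title: "Update vendored mermaid to latest release"
```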

n8v-guy

comment created time in a month

issue commentBackMarket/github-mermaid-extension

Update browser extensions to latest Mermaid

It would be even better if the plugin auto-updated its internal mermaid.js file.

brunowowk

comment created time in a month

push eventex-nerd/ex-nerd.github.io

Chris Petersen

commit sha 1735d6309a0dfe653a7bbe9e2ae0b5c5dc0ae232

push

view details

push time in a month

push eventex-nerd/ex-nerd.hugo

Chris Petersen

commit sha c3560f79605267172fe9487c9bd3b45137bfdd09

broken diapedia links to internet archive

view details

Chris Petersen

commit sha 29ad30c63034d3ac449bf29613895120dc483e05

add SPIDDM to t1 name history

view details

push time in a month

push eventex-nerd/ex-nerd.github.io

Chris Petersen

commit sha 75528b3957022c828db6d14d5f3d453d3a699acf

push

view details

push time in a month

issue commentmicrosoft/vscode-python

python.defaultInterpreterPath doesn't always work (plus feedback/concerns about DeprecatePythonPath experiment)

Thanks, and that's fair. The original wording I used was based on the unclear wording in the wiki about the use of python.defaultInterpreterPath.

I think much of this could be solved by making python.defaultInterpreterPath a setting allowed in workspace settings. As a "default" it would only come into play if the user does not have an interpreter set up, and it would allow projects to provide a suggested/preferred interpreter (which is really important when so many python projects have both minimum and maximum python version requirements due to differences between python 2.7, 3.5, 3.7, and 3.8, and virtual envs are so prevalent as a way to limit interactions between third-party packages).

Though a better idea might be to add a different workspace-specific setting python.recommendedInterpreterPath, allowing for the user to be prompted when their configured path is different ... sort of like how a project can have recommended extensions and the user is notified if any of those are not installed (and those who want to can dismiss the recommendation in such a way that it never reappears). My biggest concern with this is the user experience of the whole thing:

  1. The project's preferred python path didn't show up in the menu to select a new interpreter so I had to manually track it down in the terminal and paste in the entire path
    • This part may simply be a caching issue, since restarting vscode in between re/creating the python environments did seem to let me type in each directory name to traverse my way to the proper python version (I only tested "restart in between" once so something else may have triggered the "fixed" behavior).
  2. There is no way to notify the user that their currently-configured path is "incorrect" (for a specific interpreter preferred/needed by the project) or at least "incompatible" (for projects that are only compatible with certain python versions).
    • Basically, it's unhelpful to tell the user "hey, you don't have python configured for this workspace ... please pick your interpreter" while also preventing the project from providing even a suggestion to the user which one they should use.
    • Simply allowing the user to try to use an incompatible interpreter would trigger all kinds of unnecessary warnings in the IDE and potentially waste the user's time tracking down "problems with the code" when it's simply an incorrect environment. In fact, this is my primary concern, since one of the main reasons we use per-project project-managed python envs is specifically because so much time has been wasted helping other developers track down issues that turned out to be related to a misconfigured or out-of-date environment.
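A hypothetical workspace .vscode/settings.json for the idea above might look like the following (python.recommendedInterpreterPath is the proposed setting, not one that exists today):

```json
{
  // Proposed, hypothetical setting: prompt the user if their configured
  // interpreter differs from this one, like extension recommendations.
  "python.recommendedInterpreterPath": "${workspaceFolder}/.direnv/python-3.7.2/bin/python3"
}
```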
ex-nerd

comment created time in a month

issue commentmicrosoft/vscode-python

python.defaultInterpreterPath doesn't always work (plus feedback/concerns about DeprecatePythonPath experiment)

@karrtikr Yes... Granted, part of this is confusion around the experiment info page I was sent to when vscode notified me that I'd been opted in (which to my recollection didn't mention that python.defaultInterpreterPath was not allowed in workspace settings, simply that it replaced python.pythonPath with slightly different functionality).

But you keep answering me as if I'm not specifically reporting it as a bug that I can't control the python interpreter from my workspace settings like I can if I opt out of this experiment and revert to previous behavior. This new experimental behavior breaks previous convention and forces users to manually type in a path they may not know every time the project changes its interpreter settings (because for some reason it doesn't show up in the picker ... and I can't even use something like python.defaultInterpreterPath in the workspace config to ensure that it does show up in the picker). The experiment asked for feedback and told me to come here to provide it. So I'm providing feedback: the current implementation breaks legitimate use cases where the project maintainers (e.g. my employer and several previous ones) use a .vscode/settings.json file committed into the code base to ensure that all developers use the exact same version of python, which includes automatically switching them to new/different versions as they switch between branches. Storing manually-specified values outside of the project workspace is fine ... but it is a mistake to remove the ability for the workspace config to specify what its defaults should be.

ex-nerd

comment created time in a month

issue commentmicrosoft/vscode-python

python.defaultInterpreterPath doesn't always work (plus feedback/concerns about DeprecatePythonPath experiment)

@karrtikr As far as I can tell, the wiki is saying that my situation is an intentional choice to break projects that use workspace-specific interpreters controlled by the workspace itself:

A new user setting python.defaultInterpreterPath is introduced which is meant as a replacement for python.pythonPath in user scope, but not in workspace scope.

I'm trying to use python.defaultInterpreterPath in a workspace scope. The whole point of filing this bug is users shouldn't have to go out of their way to manually find/select an interpreter path if the workspace settings know what it should be (and it would be an error if that interpreter doesn't exist ... falling back to some other copy of python could result in bad code being added).

ex-nerd

comment created time in a month

issue commentmicrosoft/vscode-python

python.defaultInterpreterPath doesn't always work (plus feedback/concerns about DeprecatePythonPath experiment)

@karrtikr If by "in workspace settings" you mean that the default interpreter path and all other settings are stored in .vscode/settings.json in the folder in my git repository, yes. That's the whole point of configuring a separate interpreter for each of our projects. vscode is usually invoked by cd-ing into the same directory as the source and running code . (to pick up several environment variables that only exist within that directory).

If you're unable to reproduce this by using global settings, then it sounds like this is definitely a bug if vscode handles the workspace settings differently.

ex-nerd

comment created time in a month

IssuesEvent

issue commentpython-attrs/attrs

Handling of None when assigned to non-optional attribute

Wait, I misread the description of that converter. While it provides the behavior I'm suggesting here, the default_if_none converter has some issues:

  • converters don't have access to the model, so using this will require specifying the default value twice
    • this leads to potential problems if a user updates one but not the other
    • if you don't also include the default parameter for the attribute, the below example would result in a TypeError: __init__() missing 1 required positional argument: 'i'
  • doesn't actually work when explicitly setting a value to None

See:

import attr
from attr.converters import default_if_none

@attr.s()
class Test:
    i: str = attr.ib(default="init", converter=default_if_none(default="converter"))

# Sets via init: Test(i='init')
print(repr(Test()))

# Sets via converter: Test(i='converter')
print(repr(Test(i=None)))

# Doesn't trigger converter: Test(i=None)
t = Test(i="original")
t.i = None
print(repr(t))
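For comparison, the assignment behavior being asked for can be sketched without attrs at all. This DefaultIfNone descriptor is purely illustrative (its name and the underscore storage attribute are made up here), but it applies the default on plain attribute assignment, which is the case the converter above misses:

```python
# Illustrative sketch, not attrs: a descriptor that substitutes a default
# whenever the attribute is set to None, including plain assignment.
class DefaultIfNone:
    def __init__(self, default):
        self.default = default

    def __set_name__(self, owner, name):
        self.storage = "_" + name  # private slot for the real value

    def __get__(self, obj, objtype=None):
        if obj is None:
            return self
        return getattr(obj, self.storage)

    def __set__(self, obj, value):
        # None falls back to the default -- this also runs for `t.i = None`.
        setattr(obj, self.storage, self.default if value is None else value)


class Test:
    i = DefaultIfNone("init")

    def __init__(self, i=None):
        self.i = i


t = Test(i="original")
t.i = None
print(repr(t.i))  # 'init'
```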
ex-nerd

comment created time in a month

issue closedpython-attrs/attrs

Handling of None when assigned to non-optional attribute

It would be nice to have control over what happens when someone attempts to explicitly set a non-optional attribute value to None: set the default value, or raise ValueError.

The following is a rough example, including a possible use case dealing with deserialized data that might have inappropriate or missing values (yes, I'm aware of cattrs, but that would require constructing a full object rather than simply applying partial updates).

import attr

@attr.s()
class Test:
    i: int = attr.ib(default=3)
    l: list = attr.ib(factory=list)

t1 = Test(i=None, l=None)
print(repr(t1))
# Actual: Test(i=None, l=None)
# Desired: Test(i=3, l=[])

input = {"i": None}
t2 = Test(i=5, l=[1,2,3])
t2.i = input.get('i')
t2.l = input.get('l')
print(repr(t2))
# Actual: Test(i=None, l=None)
# Desired: Test(i=3, l=[])

While I'm aware there may be some concerns about modifying existing behavior (and thus unintentionally breaking code and existing workarounds), these behaviors could initially be enabled via flags on attr.ib, e.g. attr.ib(default=3, default_if_none=True) or attr.ib(default=3, error_if_none=True).

This is sort of the inverse of #460

closed time in a month

ex-nerd

issue commentpython-attrs/attrs

Handling of None when assigned to non-optional attribute

@hynek That's exactly what I'm looking for. Not sure how I missed that in the docs, thanks!

ex-nerd

comment created time in a month

issue openedmicrosoft/vscode-python

python.defaultInterpreterPath doesn't always work (plus feedback/concerns about DeprecatePythonPath experiment)

bug

After finding myself one of the 4% opted into the DeprecatePythonPath experiment, I've noticed some inconsistency in how the interpreter path is handled, and believe there is buggy behavior in how python.defaultInterpreterPath is implemented. The reproduction steps are as follows:

  • Clear out your local storage so you start fresh
  • Set python.defaultInterpreterPath to your preferred python path (e.g. a local pyenv installation of 3.7.2)
  • Load up your project and notice that it detects the interpreter path properly
  • Exit vscode
  • Delete your python install and install a new one, simulating an upgrade (e.g. 3.7.4)
  • Don't forget to update python.defaultInterpreterPath to point to the new interpreter path
  • Reopen vscode and load up your project

Actual results:

vscode detects that the interpreter it had been using is no longer available, ends up in a state with no interpreter configured, and the user has to manually specify one.

Expected results:

vscode should detect that the cached state of the interpreter is incorrect (which it seems to do) but instead of "prompting" (it's hardly a "prompt") the user to select a new interpreter, it should fall back to the configured python.defaultInterpreterPath (after all, that's what a default is for).

Additionally:

  • If you run this and choose not to fully delete the original python version, vscode will happily continue to use this old version instead of the one the project maintainers would prefer to enforce (see below for justification and use case).
  • python envs installed in these subdirectories are not detected by vscode's picker and do not always show up with autocompletion so I had to manually type out (copy/paste) the entire path in order to use it (this seems to have improved after wiping my local storage cache directories, but is still annoying).

Other thoughts

I work on a team that uses direnv to enforce a specific environment for each of our projects. This includes maintaining different versions of python in the project directories based on where these will be deployed (e.g. App Engine has access to 3.8 but App Engine Flex only 3.7.2), and allowing us to rebuild these environments on the fly, test out different versions on different branches (e.g. when Flex gets 3.8 we'll make a branch to test that but still need to switch back and forth to work on "known stable" branches).
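For context, a minimal .envrc for this kind of setup looks roughly like the following (the exact layout invocation and env var are assumptions for illustration, not something quoted from this issue):

```shell
# Hypothetical .envrc sketch: direnv's python layout creates and activates
# a per-project virtualenv under .direnv/, which is where interpreter
# paths like .direnv/python-3.7.2/bin/python3 come from.
layout python python3.7
export APP_ENV=dev  # example of the per-directory env vars mentioned above
```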

Since it seems likely from comments on #2125 that this experiment will become permanent, my initial solution to both of these was to add something like the following to the shared settings.json files in each repository to ensure that everyone is opted in to this experiment and will default to the proper python instance:

{
  "python.experiments.optInto": ["DeprecatePythonPath - experiment"],
  "python.defaultInterpreterPath": "${workspaceFolder}/.direnv/python-3.7.2/bin/python3",
  // ...
}

However, due to the aforementioned bug, many of our projects open up the "first" time with no interpreter set (seemingly conflicting with whatever value was cached before the experiment was activated), and any attempt to upgrade or switch python interpreters for a specific project means the user has to manually specify the interpreter every time they switch between branches with different python versions activated (and know which version to select, which means cracking open settings.json or our .envrc). This is not an improvement as far as usability goes.

So for now, I've opted my team out of this experiment (thanks for adding that option) until the usability issues have been resolved, and/or there are options that once again allow for project settings to control the python path:

{
  "python.experiments.optOutFrom": ["DeprecatePythonPath - experiment"],
  "python.pythonPath": "${workspaceFolder}/.direnv/python-3.7.2/bin/python3",
  // ...
}

created time in 2 months

issue commentpython-attrs/attrs

Handling of None when assigned to non-optional attribute

@hynek That would solve one of the use cases, but validators can't change the value, can they? And even if they could, they wouldn't have access to what the default value should be.

ex-nerd

comment created time in 2 months

issue openedpython-attrs/attrs

Handling of None when assigned to non-optional attribute

It would be nice to have control over what happens when someone attempts to explicitly set a non-optional attribute value to None: set the default value, or raise ValueError.

The following is a rough example, including a possible use case dealing with deserialized data that might have inappropriate or missing values (yes, I'm aware of cattrs, but that would require constructing a full object rather than simply applying partial updates).

import attr

@attr.s()
class Test:
    i: int = attr.ib(default=3)
    l: list = attr.ib(factory=list)

t1 = Test(i=None, l=None)
print(repr(t1))
# Actual: Test(i=None, l=None)
# Desired: Test(i=3, l=[])

input = {"i": None}
t2 = Test(i=5, l=[1,2,3])
t2.i = input.get('i')
t2.l = input.get('l')
print(repr(t2))
# Actual: Test(i=None, l=None)
# Desired: Test(i=3, l=[])

While I'm aware there may be some concerns about modifying existing behavior (and thus unintentionally breaking code and existing workarounds), these behaviors could initially be enabled via flags on attr.ib, e.g. attr.ib(default=3, default_if_none=True) or attr.ib(default=3, error_if_none=True).

This is sort of the inverse of #460

created time in 2 months

push eventex-nerd/scripts

Chris Petersen

commit sha 884b9bf88e04aa77b74a7ed52ebce417f79ad010

Switch gcloud config over to the one installed by homebrew

view details

push time in 2 months

Pull request review commentfsouza/fake-gcs-action

remove hard-coded pathname so test data populates

 INPUT_PUBLIC_HOST=$(printenv INPUT_PUBLIC-HOST)
 docker_image=fsouza/fake-gcs-server:${INPUT_VERSION}

 if [ -n "${INPUT_DATA}" ]; then
-	if ! [ -d "${INPUT_DATA}" ]; then
+	if [ -n "${INPUT_DEBUG}" ]; then
+		echo "INPUT_DATA=${INPUT_DATA}"
+		echo "RUNNER_WORKSPACE=${RUNNER_WORKSPACE}"
+		echo "GITHUB_WORKSPACE=${GITHUB_WORKSPACE}"
+		echo "HOME=${HOME}"
+		env

And it's gone.

ex-nerd

comment created time in 2 months

push eventex-nerd/fake-gcs-action

Chris Petersen

commit sha 6ee9ad6d573292861056b27beabbc5f687eb6f5b

Remove stray debug call to `env`

view details

push time in 2 months

Pull request review commentfsouza/fake-gcs-action

remove hard-coded pathname so test data populates

 INPUT_PUBLIC_HOST=$(printenv INPUT_PUBLIC-HOST)
 docker_image=fsouza/fake-gcs-server:${INPUT_VERSION}

 if [ -n "${INPUT_DATA}" ]; then
-	if ! [ -d "${INPUT_DATA}" ]; then
+	if [ -n "${INPUT_DEBUG}" ]; then
+		echo "INPUT_DATA=${INPUT_DATA}"
+		echo "RUNNER_WORKSPACE=${RUNNER_WORKSPACE}"
+		echo "GITHUB_WORKSPACE=${GITHUB_WORKSPACE}"
+		echo "HOME=${HOME}"
+		env

oops. yeah that is debug code left over.

ex-nerd

comment created time in 2 months

pull request commentfsouza/fake-gcs-action

remove hard-coded pathname so test data populates

Yeah, things should now work properly once this merges.

ex-nerd

comment created time in 2 months

issue openedfsouza/fake-gcs-server

Patch call with new content-type does nothing

As far as I can see in the fake-gcs-server code, contentType is only set when the object is created, and can't be updated.

I'm using the official python gcs library, basically as follows:

# init
client = BigMessyConnectionToFakeGcsServer()
bucket = client.bucket("my-test-bucket")
key = "test-key"

# upload some data
blob = bucket.blob(key)
blob.upload_from_string(b"some data", content_type="application/octet-stream")

# now fix the content type
blob = bucket.get_blob(key)
blob.content_type = "text/plain"
blob.patch()

The content-type value returned from the API request (buried in google's patch() method definition) is the original unchanged application/octet-stream, as is the content-type when trying to download the blob.

Debugging into google's code, this is sending {"contentType": "application/some-new-type"} along with the API request as the body content, which seems to be completely ignored by Server.patchObject unmarshaling into its metadata struct.
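For reference, the raw request that should rewrite the content type (endpoint shape per the GCS JSON API objects.patch method; bucket and object names taken from the example above) is roughly:

```http
PATCH /storage/v1/b/my-test-bucket/o/test-key HTTP/1.1
Host: storage.googleapis.com
Content-Type: application/json

{"contentType": "text/plain"}
```

fake-gcs-server accepts this request but, as described above, appears to discard the contentType field when unmarshaling the body.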

created time in 2 months

issue commentgabriel-vasile/mimetype

Python wrapping?

@h2non what if you need support for things like *.doc files?

DonaldTsang

comment created time in 2 months

issue commentfsouza/fake-gcs-server

Internal Error when trying to upload

Additionally, the upload seems to be pulling the key from the name query string property (similarly to media) or from the file metadata, rather than from the URL path itself.

A standard signed URL request looks like this:

https://storage.googleapis.com/my-bucket-name/test-file-for-signed-urls?X-Goog-Algorithm=GOOG4-RSA-SHA256&X-Goog-Credential=<MY-CREDENTIALS>&X-Goog-Date=20200617T035632Z&X-Goog-Expires=3600&X-Goog-SignedHeaders=host&X-Goog-Signature=<LONG-SIGNATURE-HERE>

I assume the key name is embedded in the signature (at least for verification purposes), but the bucket and key name are also right there on the URL. It would make much more sense for fake-gcs-server to honor the key name on the URL path rather than require a separate nonstandard property that GCS seems happy to ignore.
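To illustrate the point, both values can be pulled straight from the path with the standard library (this is just a demonstration of the claim, not fake-gcs-server code):

```python
# Demonstration: bucket and object key are recoverable from the URL path
# alone, without the `name` query parameter or file metadata.
from urllib.parse import urlsplit

url = (
    "https://storage.googleapis.com/my-bucket-name/"
    "test-file-for-signed-urls?X-Goog-Algorithm=GOOG4-RSA-SHA256"
)
bucket, _, key = urlsplit(url).path.lstrip("/").partition("/")
print(bucket, key)  # my-bucket-name test-file-for-signed-urls
```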

Note: I discovered this issue when using uploadType=media as a workaround for this bug, so that may also need similar behavior (I don't know how that particular uploadType works in actual GCS).

ex-nerd

comment created time in 2 months

push eventex-nerd/fake-gcs-action

Chris Petersen

commit sha 8a9b9e1d8bb8a7d6c5ba2bc2e6132ffef94dbce7

remove hard-coded pathname so test data populates - Removes the hard-coded `fake-gcs-action` part of the `INPUT_DATA` path, since it is specific to this repository. - Updates documentation and this repo's own github action to reflect path requirements.

view details

push time in 2 months

push eventex-nerd/fake-gcs-action

Chris Petersen

commit sha 254540fafb7018ce1d1ff2901189d501dba52ab3

remove hard-coded pathname so test data populates - Removes the hard-coded `fake-gcs-action` part of the `INPUT_DATA` path, since it is specific to this repository. - Updates documentation and this repo's own github action to reflect path requirements.

view details

push time in 2 months

push eventex-nerd/fake-gcs-action

Chris Petersen

commit sha c598d702b5fecff674e6d03fe50afab661d80420

remove hard-coded pathname so test data populates - Removes the hard-coded `fake-gcs-action` part of the `INPUT_DATA` path, since it is specific to this repository. - Updates documentation and this repo's own github action to reflect path requirements.

view details

push time in 2 months

push eventex-nerd/fake-gcs-action

Chris Petersen

commit sha 17f20c6b7de62485947d102866c9024628003f2d

remove hard-coded pathname so test data populates - Removes the hard-coded `fake-gcs-action` part of the `INPUT_DATA` path, since it is specific to this repository. - Updates documentation and this repo's own github action to reflect path requirements.

view details

push time in 2 months

push eventex-nerd/fake-gcs-action

Chris Petersen

commit sha e418cc22470b79892ad46de6d4be3caac66a5170

remove hard-coded pathname so test data populates - Removes the hard-coded `fake-gcs-action` part of the `INPUT_DATA` path, since it is specific to this repository. - Updates documentation and this repo's own github action to reflect path requirements.

view details

push time in 2 months

push eventex-nerd/fake-gcs-action

Chris Petersen

commit sha bac918cc2b7026c0929134594884d3be08729185

remove hard-coded pathname so test data populates - Removes the hard-coded `fake-gcs-action` part of the `INPUT_DATA` path, since it is specific to this repository. - Updates documentation and this repo's own github action to reflect path requirements.

view details

push time in 2 months

push eventex-nerd/fake-gcs-action

Chris Petersen

commit sha d8b2a6ad279c594aa775a12275c273afd24eda7a

remove hard-coded pathname so test data populates - Removes the hard-coded `fake-gcs-action` part of the `INPUT_DATA` path, since it is specific to this repository. - Updates documentation and this repo's own github action to reflect path requirements.

view details

push time in 2 months

push eventex-nerd/fake-gcs-action

Chris Petersen

commit sha d6b3236113f12d7d0a5f38270082f160f327f31b

remove hard-coded pathname so test data populates - Removes the hard-coded `fake-gcs-action` part of the `INPUT_DATA` path, since it is specific to this repository. - Updates documentation and this repo's own github action to reflect path requirements.

view details

push time in 2 months

push eventex-nerd/fake-gcs-action

Chris Petersen

commit sha 46db0724e5d507c662bb4e4f10314bd006ac413f

remove hard-coded pathname so test data populates - Removes the hard-coded `fake-gcs-action` part of the `INPUT_DATA` path, and moves the path expansion up before the "exists" check. - Updates documentation and this repo's own github action to reflect path requirements.

view details

push time in 2 months

push eventex-nerd/fake-gcs-action

Chris Petersen

commit sha a5a3717877da4a59f547b79594225b5a610373d5

remove hard-coded pathname so test data populates - Removes the hard-coded `fake-gcs-action` part of the `INPUT_DATA` path, and moves the path expansion up before the "exists" check. - Updates documentation and this repo's own github action to reflect path requirements.

view details

push time in 2 months

push eventex-nerd/fake-gcs-action

Chris Petersen

commit sha 37ded15078b95643c770ae6716604eeaaf98bd7c

remove hard-coded pathname so test data populates - Removes the hard-coded `fake-gcs-action` part of the `INPUT_DATA` path, and moves the path expansion up before the "exists" check. - Updates documentation and this repo's own github action to reflect path requirements.

view details

push time in 2 months

push eventex-nerd/fake-gcs-action

Chris Petersen

commit sha f657dee018af220db1e6fea21a292d7a2f157244

remove hard-coded pathname so test data populates - Removes the hard-coded `fake-gcs-action` part of the `INPUT_DATA` path, and moves the path expansion up before the "exists" check. - Updates documentation and this repo's own github action to reflect path requirements.

view details

push time in 2 months

push eventex-nerd/fake-gcs-action

Chris Petersen

commit sha 66ca3a05ca80db0cd4870a346597886e6164acd1

remove hard-coded pathname so test data populates - Removes the hard-coded `fake-gcs-action` part of the `INPUT_DATA` path, and moves the path expansion up before the "exists" check. - Updates documentation and this repo's own github action to reflect path requirements.

view details

push time in 2 months

push eventex-nerd/fake-gcs-action

Chris Petersen

commit sha 27bd9dbe97ed6ff0f5225a7b9f2f1e758d747e86

remove hard-coded pathname so test data populates - Removes the hard-coded `fake-gcs-action` part of the `INPUT_DATA` path, and moves the path expansion up before the "exists" check. - Updates documentation and this repo's own github action to reflect path requirements.

view details

push time in 2 months

push eventex-nerd/fake-gcs-action

Chris Petersen

commit sha b9784fcbdc12f8d5694a113f65a9452263d6a7e1

remove hard-coded pathname so test data populates - Removes the hard-coded `fake-gcs-action` part of the `INPUT_DATA` path, and moves the path expansion up before the "exists" check. - Updates documentation and this repo's own github action to reflect path requirements.

view details

push time in 2 months

push eventex-nerd/fake-gcs-action

Chris Petersen

commit sha 66bb9f74dc5907f43a3fd6e548e7f06f47061522

remove hard-coded pathname so test data populates - Removes the hard-coded `fake-gcs-action` part of the `INPUT_DATA` path, and moves the path expansion up before the "exists" check. - Updates documentation and this repo's own github action to reflect path requirements.

view details

push time in 2 months

push eventex-nerd/fake-gcs-action

Chris Petersen

commit sha da70209cda4b4afa9d99d424425656075de532a2

remove hard-coded pathname so test data populates - Removes the hard-coded `fake-gcs-action` part of the `INPUT_DATA` path, and moves the path expansion up before the "exists" check. - Updates documentation and this repo's own github action to reflect path requirements.

view details

push time in 2 months

push eventex-nerd/fake-gcs-action

Chris Petersen

commit sha 6417ba162e9821bfbf0b79e02704b0138ea226b7

remove hard-coded pathname so test data populates - Removes the hard-coded `fake-gcs-action` part of the `INPUT_DATA` path, and moves the path expansion up before the "exists" check. - Updates documentation and this repo's own github action to reflect path requirements.

view details

push time in 2 months

push eventex-nerd/fake-gcs-action

Chris Petersen

commit sha 72e4b73e31cd3ea3e55cbe777c563b16693f4a31

remove hard-coded pathname so test data populates - Removes the hard-coded `fake-gcs-action` part of the `INPUT_DATA` path, and moves the path expansion up before the "exists" check. - Updates documentation and this repo's own github action to reflect path requirements.

view details

push time in 2 months

push eventex-nerd/fake-gcs-action

Chris Petersen

commit sha c00d03dc23adede92a011280f4b6ec73319bb897

remove hard-coded pathname so test data populates - Removes the hard-coded `fake-gcs-action` part of the `INPUT_DATA` path, and moves the path expansion up before the "exists" check. - Updates documentation and this repo's own github action to reflect path requirements.

view details

push time in 2 months

startedex-nerd/fake-gcs-action

started time in 2 months

PR opened fsouza/fake-gcs-action

remove hard-coded pathname so test data populates

When starting this up with a test-data directory in my own code base, nothing is populated into the action.

I believe I've tracked it down to the code in this PR, but can't be sure until this builds.

+5 -4

0 comment

1 changed file

pr created time in 2 months

push eventex-nerd/fake-gcs-action

Chris Petersen

commit sha 0950342804ee764379297b276552016b73a50950

remove hard-coded pathname so test data populates Removes the hard-coded `fake-gcs-action` part of the `INPUT_DATA` path, and moves the path expansion up before the "exists" check.

view details

push time in 2 months

fork ex-nerd/fake-gcs-action

GitHub Action for running fake-gcs-server in a detached container in the background.

fork in 2 months

issue commentfsouza/fake-gcs-server

Github Action?

Just to confirm, this works great. Thanks for setting this up.

ex-nerd

comment created time in 2 months

issue commentPyCQA/pycodestyle

E721: `Do not compare types, use 'isinstance()'` is overzealous with `type(None)`

@FichteFoll Interesting ... I hadn't thought about reversing the two properties in isinstance() as a workaround.

ex-nerd

comment created time in 2 months

delete branch ex-nerd/fake-gcs-action

delete branch : patch-1

delete time in 2 months

PR opened fsouza/fake-gcs-action

Fix docker command

It seems that --port isn't supported.

/usr/bin/docker run --name d35b0f5a33d8a74ff2ad1b579a31655d39_476277 --label 3888d3 --workdir /github/workspace --rm -e INPUT_VERSION -e INPUT_BACKEND -e INPUT_DATA -e INPUT_PUBLIC-HOST -e INPUT_EXTERNAL-URL -e HOME -e GITHUB_JOB -e GITHUB_REF -e GITHUB_SHA -e GITHUB_REPOSITORY -e GITHUB_REPOSITORY_OWNER -e GITHUB_RUN_ID -e GITHUB_RUN_NUMBER -e GITHUB_ACTOR -e GITHUB_WORKFLOW -e GITHUB_HEAD_REF -e GITHUB_BASE_REF -e GITHUB_EVENT_NAME -e GITHUB_SERVER_URL -e GITHUB_API_URL -e GITHUB_GRAPHQL_URL -e GITHUB_WORKSPACE -e GITHUB_ACTION -e GITHUB_EVENT_PATH -e RUNNER_OS -e RUNNER_TOOL_CACHE -e RUNNER_TEMP -e RUNNER_WORKSPACE -e ACTIONS_RUNTIME_URL -e ACTIONS_RUNTIME_TOKEN -e ACTIONS_CACHE_URL -e GITHUB_ACTIONS=true -e CI=true -v "/var/run/docker.sock":"/var/run/docker.sock" -v "/home/runner/work/_temp/_github_home":"/github/home" -v "/home/runner/work/_temp/_github_workflow":"/github/workflow" -v "/home/runner/work/redacted-repo-name/redacted-repo-name":"/github/workspace" 3888d3:5b0f5a33d8a74ff2ad1b579a31655d39
unknown flag: --port
+1 -1

0 comment

1 changed file

pr created time in 2 months

push eventex-nerd/fake-gcs-action

Chris Petersen

commit sha da2d379b21b9d65913e68d9b3aaee5005bb7d76e

Fix docker command It seems that `--port` isn't supported. ``` /usr/bin/docker run --name d35b0f5a33d8a74ff2ad1b579a31655d39_476277 --label 3888d3 --workdir /github/workspace --rm -e INPUT_VERSION -e INPUT_BACKEND -e INPUT_DATA -e INPUT_PUBLIC-HOST -e INPUT_EXTERNAL-URL -e HOME -e GITHUB_JOB -e GITHUB_REF -e GITHUB_SHA -e GITHUB_REPOSITORY -e GITHUB_REPOSITORY_OWNER -e GITHUB_RUN_ID -e GITHUB_RUN_NUMBER -e GITHUB_ACTOR -e GITHUB_WORKFLOW -e GITHUB_HEAD_REF -e GITHUB_BASE_REF -e GITHUB_EVENT_NAME -e GITHUB_SERVER_URL -e GITHUB_API_URL -e GITHUB_GRAPHQL_URL -e GITHUB_WORKSPACE -e GITHUB_ACTION -e GITHUB_EVENT_PATH -e RUNNER_OS -e RUNNER_TOOL_CACHE -e RUNNER_TEMP -e RUNNER_WORKSPACE -e ACTIONS_RUNTIME_URL -e ACTIONS_RUNTIME_TOKEN -e ACTIONS_CACHE_URL -e GITHUB_ACTIONS=true -e CI=true -v "/var/run/docker.sock":"/var/run/docker.sock" -v "/home/runner/work/_temp/_github_home":"/github/home" -v "/home/runner/work/_temp/_github_workflow":"/github/workflow" -v "/home/runner/work/redacted-repo-name/redacted-repo-name":"/github/workspace" 3888d3:5b0f5a33d8a74ff2ad1b579a31655d39 unknown flag: --port ```

view details

push time in 2 months

fork ex-nerd/fake-gcs-action

GitHub Action for running fake-gcs-server in a detached container in the background.

fork in 2 months

issue commentfsouza/fake-gcs-server

Internal Error when trying to upload

One other minor point. GCS returns a 204 for signed POST requests, not a 200

ex-nerd

comment created time in 2 months

issue openedfsouza/fake-gcs-server

Internal Error when trying to upload

Extracted from my (now-removed) comment on #217 since it seems to be related to, but not identical to, that problem.

I'm trying to mimic signed upload URLs. I know my code (with a URL generated by GCS/KMS) works against actual GCS, but my attempts to generate a URL that will work with fake-gcs-server result in an Internal Error.

My replication code looks something like this:

import requests

session = requests.Session()
session.verify = False

resp = session.post(
    "https://my-public-uri.local:4443/my-bucket/my-test-key-to-create?X-Goog-Algorithm=GOOG4-RSA-SHA256&X-Goog-Credential=fake-gcs&X-Goog-Expires=3600&X-Goog-SignedHeaders=host&X-Goog-Signature=fake-gcs",
    files={
        "file": (
            "my-test-key",
            "this is an upload test",
            "text/plain",
        )
    },
)
print(resp.status_code)

This results in a response with an error message claiming invalid character '-' in numeric literal\n

I've tracked this down to a problem in loadMetadata expecting some form of JSON data but my actual body looks like this:

--ea7516aedd980cfb6672d9ed6301a60b\r\nContent-Disposition: form-data; name="file"; filename="my-test-key"\r\nContent-Type: text/plain\r\n\r\nthis is an upload test\r\n--ea7516aedd980cfb6672d9ed6301a60b--\r\n

If I add &uploadType=multipart to my URL to force this into a multipart upload, the error changes to invalid character 'h' in literal true (expecting 'r')\n from the if err != io.EOF check in multipartUpload().
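For context, a real GCS multipart upload (uploadType=multipart) uses a multipart/related body whose first part is JSON object metadata, which is presumably what loadMetadata is trying to parse. A minimal sketch of building such a body by hand (function name and boundary are mine, for illustration only):

```python
import json


def build_gcs_multipart_body(boundary, metadata, data, content_type):
    """Build a multipart/related body: a JSON metadata part first, then the media part."""
    head = (
        f"--{boundary}\r\n"
        "Content-Type: application/json; charset=UTF-8\r\n\r\n"
        f"{json.dumps(metadata)}\r\n"
        f"--{boundary}\r\n"
        f"Content-Type: {content_type}\r\n\r\n"
    ).encode()
    return head + data + f"\r\n--{boundary}--\r\n".encode()


body = build_gcs_multipart_body(
    "sep", {"name": "my-test-key"}, b"this is an upload test", "text/plain"
)
```

Note that the form-data body quoted above has no JSON part at all, which would explain why the metadata parser chokes on the leading `--` of the boundary.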

P.S. It'd be really handy if there was a way to do a stack trace for those errors ... I was lazy and added a bunch of prefix text to the error messages to figure out which was which, but it was a bit tedious.

created time in 2 months

issue commentfsouza/fake-gcs-server

Github Action?

Haven't forgotten about this ... just got stuck in the backlog of work stuff.

ex-nerd

comment created time in 2 months

issue openedPyCQA/pycodestyle

`Do not compare types, use 'isinstance()' (E721)` is overzealous with `type(None)`

Note: May be a dupe of (or at least related to) #882.

I discovered this while writing a decorator to parse the values of typing.get_type_hints(). This isn't exactly a bug in the flake8 feature, but rather an inconsistency in how NoneType is handled in the python3 typing system. It definitely feels like a bug, given that isinstance(none_arg, type(None)) and none_arg is type(None) disagree. If nothing else it's worth documenting the behavior, because I couldn't find anything written up about it:

Setup:

# Setup:

from typing import Union

my_type = Union[str, None]
# --> typing.Union[str, NoneType]
none_arg = my_type.__args__[1]
# --> <class 'NoneType'>

Examples/behavior:

# Expected output based on recommendations for validating NoneType in python 3,
# but this triggers e721:
none_arg is type(None)
# --> True

# Just for example:
none_arg is None
# --> False

# Recommended by E721 but not actually valid
isinstance(none_arg, type(None))
# --> False

isinstance(none_arg, None)
# --> TypeError: isinstance() arg 2 must be a type or tuple of types

isinstance(none_arg, (None, ))
# --> TypeError: isinstance() arg 2 must be a type or tuple of types

Current workaround:

# NoneType is no longer exported in Python 3, so define a local alias
NoneType = type(None)

none_arg is NoneType
# --> True

# However, this fails to pass `mypy` inspection, so I'm now back to:
none_arg is type(None)  # noqa: E721
# --> True
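The identity check above is also the basis of the decorator I was writing; a hedged sketch of the kind of helper involved (function name mine), unwrapping Optional annotations via type(None) identity:

```python
from typing import Optional, Union

# Local alias: Python 3 (before 3.10) doesn't export NoneType from the stdlib
NoneType = type(None)


def is_optional(tp) -> bool:
    """True if the annotation is Optional[X], i.e. Union[..., None]."""
    # `arg is NoneType` works here where isinstance() does not, because the
    # Union args are type objects, not instances.
    return any(arg is NoneType for arg in getattr(tp, "__args__", ()))
```

Each `arg is NoneType` comparison in that generator is exactly the pattern E721 flags.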

created time in 2 months

issue closedTinche/cattrs

Doesn't work with attrs-converted underscore properties

  • cattrs version: 1.0.0
  • Python version: 3.7.2
  • Operating System: MacOS/pyenv

Description

cattrs is designed to work with attr, which has a bit of magic with underscore/private properties losing their underscores. It looks like cattrs is doing similar magic with this, but it actually fails when the dict being passed in already has the underscore-property cleaned up:

What I Did

import attr
import cattr


@attr.s(auto_attribs=True)
class foo:
    a: str = attr.ib()
    _b: str = attr.ib()


foo(a="a", b="b")
# foo(a='a', _b='b')

cattr.unstructure(foo(a="a", b="b"))
# {'a': 'a', '_b': 'b'}

# This somehow works despite the init for foo() looking for `b` not `_b`
cattr.structure({"a": "a", "_b": "b"}, foo)
# foo(a='a', _b='b')

# But this fails because cattr fails to pass `b` into the __init__ function
cattr.structure({"a": "a2", "b": "b2"}, foo)

closed time in 2 months

ex-nerd

issue commentTinche/cattrs

Doesn't work with attrs-converted underscore properties

Thanks. Figured it was worth asking. In my case, I have classes for some graphql entities acting as both server and client classes (for internal integration tests), so there are a couple of fields that only exist for half of that use and need to be filtered out before being saved to the database (arangodb, like mongodb, has some special underscore properties so I'm already doing some filtering there). It's still easier to add a preprocessor in my factory method than maintain separate classes for all of this stuff (at least until we need a public python-based client library).
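The preprocessor mentioned above can be as simple as normalizing keys before handing the dict to cattr.structure — a plain-Python sketch (no cattrs required; helper name is mine):

```python
def strip_private_prefix(data: dict) -> dict:
    """Normalize dict keys to attrs' __init__ convention: '_b' becomes 'b',
    matching how attrs drops leading underscores from init arguments."""
    return {key.lstrip("_"): value for key, value in data.items()}


# Both input shapes normalize to the same init-friendly dict:
strip_private_prefix({"a": "a", "_b": "b"})  # {'a': 'a', 'b': 'b'}
strip_private_prefix({"a": "a", "b": "b"})   # {'a': 'a', 'b': 'b'}
```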

ex-nerd

comment created time in 2 months

issue openedTinche/cattrs

Doesn't work with attrs-converted underscore properties

  • cattrs version: 1.0.0
  • Python version: 3.7.2
  • Operating System: MacOS/pyenv

Description

cattrs is designed to work with attr, which has a bit of magic with underscore/private properties losing their underscores. It looks like cattrs is doing similar magic with this, but it actually fails when the dict being passed in already has the underscore-property cleaned up:

What I Did

import attr
import cattr


@attr.s(auto_attribs=True)
class foo:
    a: str = attr.ib()
    _b: str = attr.ib()


x = foo(a="a", b="b")
# foo(a='a', _b='b')

y = cattr.unstructure(x)
# {'a': 'a', '_b': 'b'}

# This somehow works despite the init for foo() looking for `b` not `_b`
cattr.structure(y, foo)
# foo(a='a', _b='b')

# But this fails because cattr fails to pass `b` into the __init__ function
cattr.structure({"a": "a2", "b": "b2"}, foo)

created time in 2 months

delete branch ex-nerd/f-engrave

delete branch : patch-1

delete time in 2 months

issue commentfsouza/fake-gcs-server

gsutil cp bad request

I think I'm running into this same issue in my tests to create signed upload URLs. When using fake-gcs-server I just return a "direct" URL without the signing key (since I can't fake the KMS stuff, and this works for downloads).

My replication code looks something like this:

import requests

session = requests.Session()
session.verify = False

resp = session.post(
    "https://my-public-uri.local:4443/my-bucket/my-test-key-to-create",
    files={
        "file": (
            "my-test-key",
            "this is an upload test",
            "text/plain",
        )
    },
)
assert resp.status_code == 204  # HTTP_204_NO_CONTENT
jonasfugedi

comment created time in 2 months

issue commentjoowani/python-arango

Incorrect error string returned for missing vertex collections

Done, thanks. https://github.com/arangodb/arangodb/issues/11730

ex-nerd

comment created time in 2 months

issue openedarangodb/arangodb

Incorrect error string returned for missing vertex collections

Attempting to insert an edge referencing a missing vertex returns an error claiming that the edge collection is missing, rather than that the vertex is missing.

My Environment

  • ArangoDB Version: 3.6.1 via official docker
  • Storage Engine: whatever is default
  • Deployment Mode: single
  • Deployment Strategy: official docker via manual command in docker-compose: arangodb --starter.mode=single --starter.data-dir=/opt/arangodb --starter.address=0.0.0.0
  • Configuration: nothing special
  • Infrastructure: docker-compose local development
  • Operating System: whatever is in your docker image
  • Total RAM in your machine: 32Gb but only 4GB allocated to docker
  • Disks in use: SSD
  • Used Package: nope

Component, Query & Data

Originally thought this was a bug in python arango filed here but it seems this message is coming straight from arangodb. Requoted here with edits based on new info:

Given a graph G that links V_DOCS to itself via edge collection EDGE_DOCS, I accidentally typo'd my link statement in something that looks roughly like this:

# using python-arango
from arango.database import StandardDatabase
db: StandardDatabase = client.db(repo_name, username, password)
g: Graph = db.graph("G")
edges: EdgeCollection = g.edge_collection("EDGE_DOCS")
edges.insert({"_key": "something", "_from": "V_TYPO/1234", "_to": "V_DOCS/3456"})

The error returned looks like this:

arango.exceptions.DocumentInsertError: [HTTP 404][ERR 1203] collection 'EDGE_DOCS' not found

Certainly EDGE_DOCS exists ... what the error should have been telling me is that the referenced vertex collection V_TYPO doesn't exist.

I originally assumed this was something in python-arango simply receiving a 404 from arangodb and filling in the EDGE_DOCS collection name on its own, but its author assures me that the error is returned verbatim from arangodb.

Steps to reproduce

Described above.

Problem:

Attempting to insert an edge referencing a missing vertex returns an error claiming that the edge collection is missing, rather than that the vertex is missing.

Expected result:

The error returned should reference the missing vertex, not the edge collection (which does exist).
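Until the upstream message improves, one workaround is to validate vertex references client-side before inserting — a hedged sketch written against python-arango's API (helper name is mine; assumes Collection.has()):

```python
def insert_edge_checked(db, edges, doc):
    """Insert an edge, but first verify both vertex references exist so a
    missing vertex produces an accurate error instead of ERR 1203."""
    for end in ("_from", "_to"):
        collection_name, _, key = doc[end].partition("/")
        if not db.collection(collection_name).has(key):
            raise ValueError(f"vertex {doc[end]!r} does not exist")
    return edges.insert(doc)
```

This adds a round-trip per endpoint, so it's only worth doing in development or test fixtures where the misleading error costs more than the extra lookups.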

created time in 2 months

issue commentfsouza/fake-gcs-server

Github Action?

@fsouza Like I said, I'm happy to contribute a PR if you want to host the repo. I eventually need to get this hooked up for work, anyway.

ex-nerd

comment created time in 3 months

issue commentfsouza/fake-gcs-server

Github Action?

Here's the official documentation: https://help.github.com/en/actions/creating-actions/creating-a-docker-container-action

Anyway, if you're interested, feel free to create an empty repo and I can submit a PR with some contents. Then you'll have to log into the github action marketplace and register it.

ex-nerd

comment created time in 3 months

issue commentfsouza/fake-gcs-server

Github Action?

@fsouza It would allow running of fake-gcs-server as a "service" within a github action. You can see this one I set up for my employer for arangodb: https://github.com/xinova/arangodb-action

Yes, it uses the docker image, but you still need to provide GitHub with a bit of information about the config. In the case of fake-gcs-server, that would presumably mean some overrides for port number, "public" server name, etc.

ex-nerd

comment created time in 3 months

issue openedfsouza/fake-gcs-server

Github Action?

Are there plans to publish a version of this to the github actions marketplace? I can whip up a version of this but it would probably make more sense for you to own the action's repository.

created time in 3 months

issue openedjoowani/python-arango

Incorrect error string returned for missing vertex collections

Given a graph G that links V_DOCS to itself via edge collection EDGE_DOCS, I accidentally typo'd my link statement in something that looks roughly like this:

g: Graph = db.graph("G")
edges: EdgeCollection = g.edge_collection("EDGE_DOCS")
edges.insert({"_key": "something", "_from": "V_TYPO/1234", "_to": "V_DOCS/3456"})

The error returned looks like this:

arango.exceptions.DocumentInsertError: [HTTP 404][ERR 1203] collection 'EDGE_DOCS' not found

Certainly EDGE_DOCS exists ... what the error should have been telling me is that the referenced vertex collection V_TYPO doesn't exist. I don't know if this information is available in the 404 from arangodb, but even a vaguer error message would have been more helpful than spending a bunch of time trying to figure out why I could query my EDGE_DOCS edges in the arangodb UI while the insert was failing in my python code.

created time in 3 months

push eventex-nerd/f-engrave

Chris Petersen

commit sha d447e2c8813b59e539000d25686ef129af2ff31e

Fix build-macOS.sh to work in more environments Clean up all of the pyenv setup so that it's actually used consistently and won't be affected by a user's own custom environment, the Catalina system python, etc.

view details

push time in 3 months

push eventex-nerd/f-engrave

Chris Petersen

commit sha e8f98b3950ba50978c68d320ffe2557208c05aa0

Fix build-macOS.sh to work in more environments Clean up all of the pyenv setup so that it's actually used consistently and won't be affected by a user's own custom environment, the Catalina system python, etc.

view details

push time in 3 months

PR opened stephenhouser/f-engrave

Update and clean up build-macOS.sh
  • Switch to using echo -e so escape sequences are honored
  • Consolidate error handling into its own function
  • Add error checking and messages after most commands so the script exits with helpful information when there is a problem
  • Clean up a few whitespace issues
  • Switch to using double-bracket bash conditionals instead of the single-bracket alias for the test program
+54 -25

0 comment

1 changed file

pr created time in 3 months

push eventex-nerd/f-engrave

Chris Petersen

commit sha 7b7652e2dbfc32620854c04e60b5779f22e5d92a

Update and clean up build-macOS.sh - Switch to using `echo -e` so enhanced/hidden character sequences are honored - Consolidate error handling into its own function - Add error checking and messages after most commands so the script exits with helpful information when there is a problem - Clean up a few whitespace issues - Switch to using double-bracket bash conditionals instead of the single-bracket alias for the `test` program

view details

push time in 3 months

fork ex-nerd/f-engrave

Packaging of Scorchworks F-Engrave as an OSX Application

http://www.scorchworks.com/Fengrave/fengrave.html

fork in 3 months

delete branch ex-nerd/f-engrave

delete branch : patch-1

delete time in 3 months

PR closed stephenhouser/f-engrave

fix display of tab characters in help text

echo needs -e to interpret backslashed control characters.

+6 -6

1 comment

1 changed file

ex-nerd

pr closed time in 3 months

pull request commentstephenhouser/f-engrave

fix display of tab characters in help text

Closing this because I have a better one that adds a bunch of error messaging, too.

ex-nerd

comment created time in 3 months

PR opened stephenhouser/f-engrave

fix display of tab characters in help text

echo needs -e to interpret backslashed control characters.

+6 -6

0 comment

1 changed file

pr created time in 3 months

push eventex-nerd/f-engrave

Chris Petersen

commit sha d7e4d0e6c39eec0566077778a7b7b13c4b20fcfd

fix display of tab characters in help text echo needs `-e` to interpret backslashed control characters.

view details

push time in 3 months

fork ex-nerd/f-engrave

Packaging of Scorchworks F-Engrave as an OSX Application

http://www.scorchworks.com/Fengrave/fengrave.html

fork in 3 months

issue closedGoogleCloudPlatform/cloud-debug-python

ERROR: No matching distribution found for google-python-cloud-debugger

As referenced in #8, this is still an issue with Python 3.7 in MacOS. I can see google-python-cloud-debugger (2.13) using pip search but when I go to install it, I get:

ERROR: Could not find a version that satisfies the requirement google-python-cloud-debugger (from versions: none)
ERROR: No matching distribution found for google-python-cloud-debugger

closed time in 3 months

ex-nerd

issue commentGoogleCloudPlatform/cloud-debug-python

ERROR: No matching distribution found for google-python-cloud-debugger

@di I think the main issue here for me is that it's very difficult to use this library while developing something for App Engine, since it requires the library be added to a requirements.txt file ... which then breaks local development. I'm still looking into the use of conditional requirements per PEP 508 (e.g. google-python-cloud-debugger; sys_platform == 'linux') but it would be nice if there was an easy way to make this package "work" on other platforms, even if "work" is simply "installs a dummy package that prints a log message that it doesn't do anything" (which is more or less what's being asked in #9)

It would also help if all of this information about it being linux-only was covered explicitly (and easily visible) in the package documentation.

Anyway, since this is clearly a dupe of #9, I'll close this issue and follow that one.

ex-nerd

comment created time in 3 months

issue commentjoke2k/faker

black the whole code

I've started doing something similar on my own projects, going so far as to add both isort and autoflake in addition to black. It makes diffs really nice, especially with a pre-commit hook bash script that autoformats everything:

# Sort all inputs
# Sadly doesn't respect .gitignore so just run it selectively:
isort -rc \
    root-directory-of-my-python-code/

# Remove unused variables/imports
# Sadly doesn't respect .gitignore so just run it selectively:
autoflake -r --in-place --remove-unused-variables --remove-all-unused-imports \
    root-directory-of-my-python-code/

# Autoformat everything else
black .

This is supported by a pyproject.toml to make isort and black play nicely together:

# https://black.readthedocs.io/en/stable/pyproject_toml.html#configuration-format
[tool.black]
line-length = 88
target-version = ['py37']
include = '\.pyi?$'
exclude = '''
(
  /(
      \.eggs         # exclude a few common directories in the
    | \.egg-info     # root of the project
    | \.git
    | \.mypy_cache
    | \.tox
    | _build
    | build
    | dist
  )/
  | /(
      \.direnv
    | \.vscode
    | \.envrc
    | \.python
  )/
  | foo.py           # also separately exclude a file named foo.py in
                     # the root of the project (as an example)
)
'''

[tool.isort]
# Settings to match black: https://pypi.org/project/black/
# Keep an eye on https://github.com/psf/black/issues/333 for future compat changes.
multi_line_output = 3
include_trailing_comma = true
force_grid_wrap = 0
use_parentheses = true
line_length = 88

Anyway, hopefully the configs are helpful and save someone some time if others think this is a good idea.

eumiro

comment created time in 3 months

issue closedjoke2k/faker

Please support weighted choices

Since Python 3.6, random.choices has supported passing in weights for the choices, which would allow for more realistic generation of data sets like random languages (#1179). It would be nice if fake.random_elements and its relatives could be updated to support this.

Note that this is different from the current dependency random_elements has on OrderedDict (I wonder if this dependency is still needed for python >= 3.6), which references probability values (relative to 1.0), not simple weights (relative to other weight values).
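For reference, here is a minimal stdlib-only sketch of the requested behavior (names and counts illustrative), showing that raw weights need no normalization:

```python
import random

# Raw counts (e.g. millions of native speakers) work directly as weights;
# random.choices() handles relative weights itself, no scaling to 1.0 needed.
language_weights = {"Mandarin Chinese": 918, "Spanish": 480, "English": 379}


def random_languages(k=1, rng=random):
    """Draw k languages, weighted by relative counts (with replacement)."""
    names = list(language_weights)
    return rng.choices(names, weights=list(language_weights.values()), k=k)
```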

closed time in 3 months

ex-nerd

issue commentjoke2k/faker

Please support weighted choices

@malefice Ah, the docs/example seemed to suggest that the weights were based on a 1.0 scale. That will make things easier to work with. If that's the case, I'll just close this and assume the move away from OrderedDict can be dealt with in the discussion about removing support for python2 (assuming that the issues with python 3's built-in dicts do not still require it).

But @eumiro is correct. There are situations where it would be helpful if the data being generated matched more with the real world (particularly when modeling data for search engine development). It could be implemented as an additional flag on the data set, though that starts to get complex if you want to talk about how distributions change based on which locale you're using (e.g. there is a higher percentage of native Chinese speakers in China than in the US). Cool idea for thought, though.

ex-nerd

comment created time in 3 months

issue commentjoke2k/faker

New feature: language_name

Just for posterity, here's a quick language provider (in English names) with distribution based on real-world native speakers.

from collections import OrderedDict
from typing import List

from faker.providers import BaseProvider

# List of languages and the number (in millions) of people who speak them, according to
# https://en.wikipedia.org/wiki/List_of_languages_by_number_of_native_speakers
# Faker requires this be an OrderedDict for random_elements to work.
_languages = OrderedDict(
    [
        ("Mandarin Chinese", 918 / 5567,),
        ("Spanish", 480 / 5567,),
        ("English", 379 / 5567,),
        ("Hindi", 341 / 5567,),
        ("Bengali", 228 / 5567,),
        ("Portuguese", 221 / 5567,),
        ("Russian", 154 / 5567,),
        ("Japanese", 128 / 5567,),
        ("Western Punjabi", 92 / 5567,),
        ("Marathi", 83 / 5567,),
        ("Telugu", 82 / 5567,),
        ("Wu Chinese", 81 / 5567,),
        ("Turkish", 79 / 5567,),
        ("Korean", 77 / 5567,),
        ("French", 77 / 5567,),
        ("German", 76 / 5567,),
        ("Vietnamese", 76 / 5567,),
        ("Tamil", 75 / 5567,),
        ("Yue Chinese", 73 / 5567,),
        ("Urdu", 68 / 5567,),
        ("Javanese", 68 / 5567,),
        ("Italian", 64 / 5567,),
        ("Egyptian Arabic", 64 / 5567,),
        ("Gujarati", 56 / 5567,),
        ("Iranian Persian", 52 / 5567,),
        ("Bhojpuri", 52 / 5567,),
        ("Min Nan Chinese", 50 / 5567,),
        ("Hakka Chinese", 48 / 5567,),
        ("Jin Chinese", 46 / 5567,),
        ("Hausa", 43 / 5567,),
        ("Kannada", 43 / 5567,),
        ("Indonesian", 43 / 5567,),
        ("Polish", 39 / 5567,),
        ("Yoruba", 37 / 5567,),
        ("Xiang Chinese", 37 / 5567,),
        ("Malayalam", 37 / 5567,),
        ("Odia", 34 / 5567,),
        ("Maithili", 33 / 5567,),
        ("Burmese", 32 / 5567,),
        ("Eastern Punjabi", 32 / 5567,),
        ("Sunda", 32 / 5567,),
        ("Sudanese Arabic", 31 / 5567,),
        ("Algerian Arabic", 29 / 5567,),
        ("Moroccan Arabic", 27 / 5567,),
        ("Ukrainian", 27 / 5567,),
        ("Igbo", 27 / 5567,),
        ("Northern Uzbek", 25 / 5567,),
        ("Sindhi", 24 / 5567,),
        ("North Levantine Arabic", 24 / 5567,),
        ("Romanian", 24 / 5567,),
        ("Tagalog", 23 / 5567,),
        ("Dutch", 23 / 5567,),
        ("Saʽidi Arabic", 22 / 5567,),
        ("Gan Chinese", 22 / 5567,),
        ("Amharic", 21 / 5567,),
        ("Northern Pashto", 20 / 5567,),
        ("Magahi", 20 / 5567,),
        ("Thai", 20 / 5567,),
        ("Saraiki", 20 / 5567,),
        ("Khmer", 16 / 5567,),
        ("Chhattisgarhi", 16 / 5567,),
        ("Somali", 16 / 5567,),
        ("Malay (Malaysian Malay)", 16 / 5567,),
        ("Cebuano", 15 / 5567,),
        ("Nepali", 15 / 5567,),
        ("Mesopotamian Arabic", 15 / 5567,),
        ("Assamese", 15 / 5567,),
        ("Sinhalese", 15 / 5567,),
        ("Northern Kurdish", 14 / 5567,),
        ("Hejazi Arabic", 14 / 5567,),
        ("Nigerian Fulfulde", 14 / 5567,),
        ("Bavarian", 14 / 5567,),
        ("South Azerbaijani", 13 / 5567,),
        ("Greek", 13 / 5567,),
        ("Chittagonian", 13 / 5567,),
        ("Kazakh", 12 / 5567,),
        ("Deccan", 12 / 5567,),
        ("Hungarian", 12 / 5567,),
        ("Kinyarwanda", 12 / 5567,),
        ("Zulu", 12 / 5567,),
        ("South Levantine Arabic", 11 / 5567,),
        ("Tunisian Arabic", 11 / 5567,),
        ("Sanaani Spoken Arabic", 11 / 5567,),
        ("Min Bei Chinese", 11 / 5567,),
        ("Southern Pashto", 10 / 5567,),
        ("Rundi", 10 / 5567,),
        ("Czech", 10 / 5567,),
        ("Taʽizzi-Adeni Arabic", 10 / 5567,),
        ("Uyghur", 10 / 5567,),
        ("Min Dong Chinese", 10 / 5567,),
        ("Sylheti", 10 / 5567,),
    ]
)


class LanguageProvider(BaseProvider):
    def languages(self, length: int = None, unique: bool = None) -> List[str]:
        return self.random_elements(_languages, length, unique)

    def language(self) -> str:
        return self.languages(1)[0]
ikhomutov

comment created time in 3 months

issue openedjoke2k/faker

Fake USPTO Patent Numbers

It feels like too much work right now to disable black and the other automatic formatting tools I use and downgrade this to work with python2, and I'm not sure it's something worthy of fitting into the core of faker. So for now, if you would like a provider that generates proper USPTO patent numbers, here is some code to start with. If this is worthy of a PR, just let me know and I'll submit it as such.

from faker.providers import BaseProvider


class USPTOProvider(BaseProvider):
    """
    Generates random patent numbers based on the rules from the USPTO:
    https://www.uspto.gov/patents-application-process/applying-online/patent-number

    """

    def uspto_num(self) -> str:
        """ Returns a patent number of a random type """
        fn = self.generator.random_element(
            elements=(
                self.uspto_utility,
                self.uspto_reissue,
                self.uspto_plant,
                self.uspto_design,
                self.uspto_ai,
                self.uspto_x,
                self.uspto_h,
                self.uspto_t,
            )
        )
        return fn()

    def uspto_utility(self) -> str:
        # Utility : Patent numbers consist of six, seven or eight digits. Enter the
        # Patent number excluding commas and spaces and omit leading zeroes.
        num_digits = self.generator.random_int(6, 8)
        return str(self.generator.random_number(digits=num_digits, fix_len=True))

    def uspto_reissue(self) -> str:
        # Reissue : (e.g., Rennnnnn, RE000126) must enter leading zeroes between "RE"
        # and number to create 6 digits.
        return f"RE{self.generator.random_number(digits=6):06}"

    def uspto_plant(self) -> str:
        # Plant Patents :(e.g., PPnnnnnn, PP000126) must enter leading zeroes between
        # "PP" and number to create 6 digits.
        return f"PP{self.generator.random_number(digits=6):06}"

    def uspto_design(self) -> str:
        # Design : (e.g., Dnnnnnnn, D0000126) must enter leading zeroes between "D" and
        # number to create 7 digits.
        return f"D{self.generator.random_number(digits=7):07}"

    def uspto_ai(self) -> str:
        # Additions of Improvements : (e.g., AInnnnnn , AI000126) must enter leading
        # zeroes between "AI" and number to create 6 digits.
        return f"AI{self.generator.random_number(digits=6):06}"

    def uspto_x(self) -> str:
        # X Patents : (e.g., Xnnnnnnn , X0000001) must enter leading zeroes between "X"
        # and number to create 7 digits.
        return f"X{self.generator.random_number(digits=7):07}"

    def uspto_h(self) -> str:
        # H Documents : (e.g., Hnnnnnnn , H0000001) must enter leading zeroes between
        # "H" and number to create 7 digits.
        return f"H{self.generator.random_number(digits=7):07}"

    def uspto_t(self) -> str:
        # T Documents : (e.g., Tnnnnnnn , T0000001) must enter leading zeroes between
        # "T" and number to create 7 digits.
        return f"T{self.generator.random_number(digits=7):07}"

created time in 3 months

issue openedjoke2k/faker

Please support weighted choices

Since Python 3.6, random.choices has supported passing in weights for the choices, which would allow for more realistic generation of data sets like random languages (#1179). It would be nice if fake.random_elements and its relatives could be updated to support this.

created time in 3 months

issue commentjoke2k/faker

Provide random gender

@chobeat sex (physical) and gender (cultural/societal) are not the same thing. While there are quite a few human sexes (though only male/female are recognized by most people outside of medical science, and many intersex individuals don't actually know that they are anything other than male/female), there are an order of magnitude more recognized genders (just look at Facebook, which includes more than 50 options).

svenstaro

comment created time in 3 months

issue openedjoke2k/faker

Feature request: safe_domain_name()

There are methods like safe_email(), but nothing that just returns an RFC 6761 domain name for those cases where a user might want to provide a username generated off of other data (e.g. a faker-generated first/last name combo, so it doesn't look weird that George Smith has an email address of janedoe@example.com).

I know I can get this with fake.random_element(["example.com", "example.net", "example.org"]) but it would be nice to have this built in.
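In the meantime, a stdlib-only sketch of what such a provider could return (function names are mine; domains are the RFC 6761 reserved examples):

```python
import random

SAFE_DOMAINS = ("example.com", "example.net", "example.org")  # reserved by RFC 6761


def safe_domain_name(rng=random):
    """Return one of the reserved example domains."""
    return rng.choice(SAFE_DOMAINS)


def safe_email_for(first_name, last_name, rng=random):
    """Build an address whose local part matches a generated person's name."""
    return f"{first_name.lower()}.{last_name.lower()}@{safe_domain_name(rng)}"
```

This keeps the local part consistent with other generated person data while still guaranteeing the domain can never route mail to a real mailbox.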

created time in 3 months

issue commentjoke2k/faker

New feature: language_name

This information is conveniently available, including percentages of natural speakers, which could allow for semi-realistic generation of data: https://en.wikipedia.org/wiki/List_of_languages_by_number_of_native_speakers

ikhomutov

comment created time in 3 months

push eventex-nerd/DefinitelyTyped

Chris Petersen

commit sha c698b7bd669810b1e04a78df0243f3a86b1fb2b3

Fix typo in aql.join() definition join() takes more than one kind of value, not just ArangoDB.Query.

view details

push time in 3 months

PR opened DefinitelyTyped/DefinitelyTyped

Add a few new arangodb properties
  • Add persistent IndexType
  • Add aql.join()
  • Add name and minLength to IndexDescription
+4 -1

0 comment

1 changed file

pr created time in 3 months

push eventex-nerd/DefinitelyTyped

Chris Petersen

commit sha 291b1520e4b7d5e280e7b09f8364f1bc074a7c2e

Add a few new arangodb properties - Add persistent IndexType - Add aql.join() - Add name and minLength to IndexDescription

view details

push time in 3 months

fork ex-nerd/DefinitelyTyped

The repository for high quality TypeScript type definitions.

fork in 3 months

more