Alan D. Snow (snowman2), @corteva
https://www.linkedin.com/in/alan-snow-55bb8726/
My expertise is a mix of programming, GIS, hydrology, and web development.

pyproj4/pyproj 623

Python interface to PROJ (cartographic projections and coordinate transformations library)
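A minimal sketch of typical usage (illustrative CRS codes and coordinates only, not tied to any entry below):

    from pyproj import Transformer

    # Reproject a single point from geographic WGS84 to Web Mercator.
    transformer = Transformer.from_crs("EPSG:4326", "EPSG:3857", always_xy=True)
    x, y = transformer.transform(-120.0, 34.0)
    print(x, y)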

corteva/geocube 103

Tool to convert geopandas vector data into rasterized xarray data.
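A minimal sketch of that conversion, assuming a hypothetical fields.geojson input and an arbitrary resolution:

    import geopandas
    from geocube.api.core import make_geocube

    # Rasterize vector attributes onto a regular grid as an xarray Dataset.
    vector_data = geopandas.read_file("fields.geojson")
    cube = make_geocube(vector_data=vector_data, resolution=(-0.0001, 0.0001))
    cube.to_netcdf("fields.nc")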

erdc/proteus 63

A computational methods and simulation toolkit

geoxarray/geoxarray 50

Geolocation utilities for xarray

erdc/pangaea 11

An xarray extension for gridded land surface & weather model output.

erdc/RAPIDpy 11

RAPIDpy is a Python interface for RAPID that helps prepare inputs, run the RAPID program, and post-process its outputs.

ContinuumIO/whitebox-geospatial-analysis-tools 6

An open-source GIS and remote sensing package

erdc/AutoRoutePy 6

Python-based interface for AutoRoute

erdc/spt_compute 5

Runs streamflow forecasts using ECMWF predicted runoff and RAPID (Forked from: https://github.com/CI-WATER/erfp_data_process_ubuntu_aws).

erdc/tethysapp-streamflow_prediction_tool 5

Web app for displaying streamflow predictions using a GIS-based interface (Forked from: https://github.com/CI-WATER/tethysapp-erfp_tool).

push event OSGeo/PROJ

Even Rouault

commit sha 9a2f2ed95074df7e22d89521a93ccccc4abfd789

Database: update to EPSG v10.027

Even Rouault

commit sha c959db170fe05b6c0bdff84876d59a8d425bb4b1

Merge pull request #2751 from rouault/epsg_10_027

Database: update to EPSG v10.027

push time in 12 hours

PR merged OSGeo/PROJ

Database: update to EPSG v10.027
+31 -20

0 comments

8 changed files

rouault

pr closed time in 12 hours

PR opened OSGeo/PROJ

Database: update to EPSG v10.027
+31 -20

0 comments

8 changed files

pr created time in 14 hours

pull request comment conda-forge/rasterio-feedstock

Rebuild for gdal33

Hi! This is the friendly conda-forge automerge bot!

I considered the following status checks when analyzing this PR:

  • linter: passed
  • azure: failed

Thus the PR was not passing and not merged.
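The rule the bot applies boils down to "merge only if every considered check passed"; here is a hypothetical sketch of that decision (illustrative only, not the feedstock automation's actual code):

    # Hypothetical illustration of the automerge rule described above.
    def should_automerge(status_checks):
        # status_checks maps check name -> state, e.g. {"linter": "passed", "azure": "failed"}
        return all(state == "passed" for state in status_checks.values())

    print(should_automerge({"linter": "passed", "azure": "failed"}))  # False -> not merged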

xylar

comment created time in 2 days

pull request comment conda-forge/rasterio-feedstock

Rebuild for gdal33

Looks like things are working now so I've set it to automerge.

xylar

comment created time in 2 days

pull request comment conda-forge/rasterio-feedstock

Rebuild for gdal33

@ocefpaf, it seems like we need to add the migrator for proj back and then rerender. I can do that.

Ah yes! Sorry I missed that.

xylar

comment created time in 2 days

pull request comment conda-forge/rasterio-feedstock

Rebuild for gdal33

@conda-forge-admin, please rerender

xylar

comment created time in 2 days

pull request comment conda-forge/rasterio-feedstock

Rebuild for gdal33

@ocefpaf, it seems like we need to add the migrator for proj back and then rerender. I can do that.

xylar

comment created time in 2 days

pull request comment conda-forge/rasterio-feedstock

Rebuild for gdal33

Builds with proj 7.2.0 aren't solvable, so that's presumably why the bot hasn't done this migration yet.

We can drop those for now. It is not sustainable to keep the older version for long. I'll push a few commits to your branch.

xylar

comment created time in 2 days

Pull request review comment OSGeo/PROJ

S2 projection

 PJ_XY pj_fwd(PJ_LP lp, PJ *P) {
     last_errno = proj_errno_reset(P);
-    if (!P->skip_fwd_prepare)
+    if (!P->skip_fwd_prepare && 0 != strcmp(P->short_name, "s2"))

oops sorry, it should have been:

P->left = PJ_IO_UNITS_RADIANS; 
P->right = PJ_IO_UNITS_PROJECTED;
marcus-elia

comment created time in 2 days

Pull request review comment OSGeo/PROJ

S2 projection

 PJ_XY pj_fwd(PJ_LP lp, PJ *P) {
     last_errno = proj_errno_reset(P);
-    if (!P->skip_fwd_prepare)
+    if (!P->skip_fwd_prepare && 0 != strcmp(P->short_name, "s2"))

This has introduced an error where operations that take non-angular input coordinates can't be initialized. Is there a different solution I could try?

marcus-elia

comment created time in 2 days

push event OSGeo/PROJ

Even Rouault

commit sha 5f6fac3afdb03aef5a26ce5f60c10ed612fa6fc2

proj_trans/cs2cs: If two operations have the same accuracy, use the one that is contained within a larger one

Relates to https://github.com/OSGeo/gdal/issues/3998

Before that change, cs2cs on a NAD83(HARN) to WGS84 transformation would use the "NAD83(HARN) to WGS 84 (1)" transformation (a null Helmert shift) that is valid for the whole US, including non-CONUS areas, even when used on points located in CONUS, which has a "NAD83(HARN) to WGS 84 (3)" transformation (a non-null Helmert shift) with the same accuracy (1 m). But if doing EPSG:2874 "NAD83(HARN) / California zone 5 (ftUS)" to WGS84, we would use this latter "NAD83(HARN) to WGS 84 (3)" transformation because the area of use of EPSG:2874 is restricted to CONUS. This isn't consistent.

With that change, we now have more consistent behavior, even if it can be argued which of the 2 transformations is the best...

$ echo 34 -120 | src/cs2cs -d 8 EPSG:4326 "NAD83(HARN)" | src/cs2cs "NAD83(HARN)" EPSG:2874
5955507.74 1828410.98 0.00

$ echo 34 -120 | src/cs2cs EPSG:4326 EPSG:2874
5955507.74 1828410.98 0.00

Even Rouault

commit sha 916eaa4349e2f46320d49ae0f9d2993d98a8335d

operations_computation.rst: add note about proj_create_crs_to_crs not necessarily using the operation that appears as first

Even Rouault

commit sha a79b5558da1a391a9d7418316c9f507583d41b2a

Merge pull request #2750 from rouault/better_operation_selection

proj_trans/cs2cs: If two operations have the same accuracy, use the one that is contained within a larger one

push time in 2 days

PR merged OSGeo/PROJ

proj_trans/cs2cs: If two operations have the same accuracy, use the one that is contained within a larger one

Relates to https://github.com/OSGeo/gdal/issues/3998

Before that change, cs2cs on a NAD83(HARN) to WGS84 transformation would use the "NAD83(HARN) to WGS 84 (1)" transformation (a null Helmert shift) that is valid for the whole US, including non-CONUS areas, even when used on points located in CONUS, which has a "NAD83(HARN) to WGS 84 (3)" transformation (a non-null Helmert shift) with the same accuracy (1 m).

But if doing EPSG:2874 "NAD83(HARN) / California zone 5 (ftUS)" to WGS84, we would use this latter "NAD83(HARN) to WGS 84 (3)" transformation because the area of use of EPSG:2874 is restricted to CONUS. This isn't consistent.

With that change, we now have more consistent behavior, even if it can be argued which of the 2 transformations is the best...

$ echo 34 -120 | src/cs2cs -d 8 EPSG:4326 "NAD83(HARN)" | src/cs2cs "NAD83(HARN)" EPSG:2874
5955507.74 1828410.98 0.00

$ echo 34 -120 | src/cs2cs EPSG:4326 EPSG:2874
5955507.74 1828410.98 0.00

+33 -4

4 comments

4 changed files

rouault

pr closed time in 2 days

pull request comment OSGeo/PROJ

proj_trans/cs2cs: If two operations have the same accuracy, use the one that is contained within a larger one

I've just added a note in operations_computation.rst about that

rouault

comment created time in 2 days

pull request comment OSGeo/PROJ

proj_trans/cs2cs: If two operations have the same accuracy, use the one that is contained within a larger one

Okay, thanks for the clarification. Just wanted to make sure this was remembered in case the logic changed.

rouault

comment created time in 2 days

pull request comment OSGeo/PROJ

proj_trans/cs2cs: If two operations have the same accuracy, use the one that is contained within a larger one

Does this PR change the logic of what's described in https://proj.org/operations/operations_computation.html ?

No, this page explains how we compute and sort operations. Here the change is about how we use the resulting sorted list of operations: we don't necessarily use the operation that was ranked first, since once we have coordinates we have more context and can pick the one that seems most relevant.
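For readers following along from Python, a hedged illustration with pyproj (the candidate list and ranking come from PROJ itself; the CRS codes here are only an example):

    from pyproj.transformer import TransformerGroup

    # Inspect the sorted candidate operations PROJ computed for this CRS pair.
    group = TransformerGroup("EPSG:4326", "EPSG:2874")
    for transformer in group.transformers:
        print(transformer.accuracy, transformer.description)

    # A transformer built with Transformer.from_crs may still choose among the
    # candidates per point, depending on where the coordinates fall.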

rouault

comment created time in 2 days

pull request comment OSGeo/PROJ

proj_trans/cs2cs: If two operations have the same accuracy, use the one that is contained within a larger one

Seems like a sensible change to me. Does this PR change the logic of what's described in https://proj.org/operations/operations_computation.html ?

rouault

comment created time in 2 days

push event opendatacube/datacube-core

Kirill Kouzoubov

commit sha 20b532ab1443bf9fdb4083d599e537d6a7cd2fa6

Better caching when requirements change

also remove no longer needed NOBINARY env
also ensure V_PG is properly declared

Kirill Kouzoubov

commit sha a4ce8de9ad55e3500e637f0e87533473c3152317

Use buildx with caching for building test docker

Kirill Kouzoubov

commit sha ea2051e372391fdb39f00e3fb566e2fe18e26a6c

Do not duplicate requirements information

constraints.in is now requirements.txt

Kirill Kouzoubov

commit sha efe29cfb5852f4ac4d5cdb66916d1ab484a54a72

Make sure twine and wheel are installed

Kirill Kouzoubov

commit sha 3c8368a8795c2ccabeb8626697c7ccde68abd8e2

Do not use confusing variable name 'l'

Kirill Kouzoubov

commit sha 65fe46204558fe480edde9b1726861efa39642eb

Fix test to work with newer hypothesis #1067

Can't combine `given` with function-scoped fixtures, so use a session-scoped fixture instead
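For context, a minimal hypothetical test (not datacube-core's actual code) showing the pattern this commit switches to, since newer hypothesis rejects function-scoped fixtures inside `given` tests:

    import pytest
    from hypothesis import given, strategies as st

    # Session-scoped fixture: created once and safely reused across generated examples.
    @pytest.fixture(scope="session")
    def lookup_table():
        return {"a": 1, "b": 2}

    @given(key=st.sampled_from(["a", "b"]))
    def test_lookup(lookup_table, key):
        assert lookup_table[key] in (1, 2)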

Kirill Kouzoubov

commit sha e0e069769d8358fe8ab38260ec3879ce6cef9c65

Apply black formatter

Kirill Kouzoubov

commit sha 15bc7bef5f36cfe8240a634b331e9375c7c75ac9

Bump hypothesis lib in the test runner

Kirill Kouzoubov

commit sha 7ae0c52d33b22f109446992bd2f46933736d9ddb

Check arguments when constructing GridWorkflow class #1077

Kirill Kouzoubov

commit sha 5c400a88d73583f32f238d6b6d9eaf80527dfed2

Update docs to use eo3 for sample product #1063

Kirill Kouzoubov

commit sha 60c003938dd6cd895560c3f74121278c998c0a78

Smoke test str/repr of GridWorkflow classes

Kirill Kouzoubov

commit sha 5ffbf159a0354d557bf18a53c92a3a3e2f3914be

Test corner case of GridWorkflow missing product from query

Kirill Kouzoubov

commit sha 77f40c0ab9c2d82244ea4a660b3aa0fd85eebadb

Add xfail test for UUID without quotes parsing

Kirill Kouzoubov

commit sha 99a99e11b364f27075d206b8a424102865ad8f40

Parse UUIDs in search grammar #1029

Kirill Kouzoubov

commit sha a63e4f5dd32abbf2a115fefadd88dfbaa4457e62

Allow search terms like region_code=56KKD

A simple string can start with a digit as long as it is followed by a non-digit character later in the token
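As an illustration only (a hypothetical regular expression, not the actual search grammar), the stated rule amounts to something like:

    import re

    # Leading digits are allowed only if a non-digit character follows in the token.
    SIMPLE_STRING = re.compile(r"\d*[A-Za-z_][A-Za-z0-9_-]*")

    assert SIMPLE_STRING.fullmatch("56KKD")          # digits then letters: accepted
    assert not SIMPLE_STRING.fullmatch("12345")      # digits only: not a simple string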

Kirill Kouzoubov

commit sha 0826d429be6a5802f000f7a10ad669349a525ebf

Raise error when time= query contains too many arguments #1033

Kirill Kouzoubov

commit sha 4d3b59d5b79dc0d337a664195a885f5e78a5ee71

Raise less confusing error when product is misspelled #1015

Kirill Kouzoubov

commit sha 14dd773c5917f8da1280a075327e706fb87e509e

Three-element time "range" is not a valid input anymore; remove it from test data

Peter Wang

commit sha f5dd19bc76f4093fb67dc7e755baccf379f3970a

Update tests for new version of boto

Peter Wang

commit sha 6aaba7abc5bc6543ce4ea322e120712440e3e0b8

First lot of changes to update ODC for 3D

push time in 2 days

pull request comment opendatacube/datacube-core

3d datasets

Codecov Report

Merging #1099 (4c3202c) into develop (d5f7618) will increase coverage by 0.10%. The diff coverage is 96.78%.

@@             Coverage Diff             @@
##           develop    #1099      +/-   ##
===========================================
+ Coverage    93.78%   93.88%   +0.10%     
===========================================
  Files          102      102              
  Lines        10106    10294     +188     
===========================================
+ Hits          9478     9665     +187     
- Misses         628      629       +1     
Impacted Files Coverage Δ
datacube/storage/_base.py 100.00% <ø> (ø)
datacube/api/core.py 97.11% <93.47%> (-1.31%) :arrow_down:
datacube/model/__init__.py 97.30% <99.00%> (+0.37%) :arrow_up:
datacube/api/query.py 93.06% <100.00%> (+0.36%) :arrow_up:
datacube/index/_products.py 94.32% <100.00%> (+0.04%) :arrow_up:
datacube/index/hl.py 100.00% <100.00%> (ø)
datacube/storage/_load.py 100.00% <100.00%> (ø)
datacube/storage/_read.py 100.00% <100.00%> (ø)
datacube/ui/expression.py 78.26% <0.00%> (ø)
datacube/index/_datasets.py 94.73% <0.00%> (+0.04%) :arrow_up:
... and 3 more

Continue to review full report at Codecov.

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data. Powered by Codecov. Last update d5f7618...4c3202c.

petewa

comment created time in 2 days

push event opendatacube/datacube-core

Peter Wang

commit sha 4c3202c70c68f60c58cc14a1b88a22b95c13c21a

Address PR (remove auto-generated 2D measurements, measurement_map and alias_map)

spectral_definition now supports a list of definitions.
Added --allow-multiple-documents to .pre-commit-config.yaml to allow multi-document definitions.
Docs to follow.

push time in 2 days

Pull request review comment corteva/rioxarray

dtype for GDAL CInt16, rasterio complex_int16

 def to_raster(self, xarray_dataarray, tags, windowed, lock, compute, **kwargs):
         **kwargs
             Keyword arguments to pass into writing the raster.
         """
-        dtype = kwargs["dtype"]
+        numpy_dtype = kwargs["dtype"]
+        if xarray_dataarray.encoding.get("_rasterio_dtype") == "complex_int16":
+            kwargs["dtype"] = "complex_int16"

we will need to have a way to get the rasterio and numpy dtype. For the writer, I think that it is best to assume that the "dtype" kwarg could be either the rasterio or numpy dtype and automatically determine the numpy/rasterio dtype from it.

I think the only special dtype is complex_int16, so I went with this approach; let me know if you prefer something else. Tests pass except for the new one (it requires pip install git+https://github.com/mapbox/rasterio.git@maint-1.2; the fixes aren't yet in the master branch).
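As a rough sketch of the approach under discussion (a hypothetical helper, not rioxarray's actual implementation), resolving either spelling of the dtype into a (rasterio, numpy) pair could look like:

    import numpy as np

    def _resolve_dtype(dtype):
        # Hypothetical helper: accept a rasterio or numpy dtype name and return
        # (rasterio_dtype, numpy_dtype). complex_int16 is the only rasterio dtype
        # without a direct numpy equivalent; it maps to complex64 in memory.
        if dtype == "complex_int16":
            return "complex_int16", np.dtype("complex64")
        return dtype, np.dtype(dtype)

    print(_resolve_dtype("complex_int16"))  # ('complex_int16', dtype('complex64'))
    print(_resolve_dtype("float32"))        # ('float32', dtype('float32'))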

scottyhq

comment created time in 2 days

Pull request review comment corteva/rioxarray

dtype for GDAL CInt16, rasterio complex_int16

 def to_raster(self, xarray_dataarray, tags, windowed, lock, compute, **kwargs):
         **kwargs
             Keyword arguments to pass into writing the raster.
         """
-        dtype = kwargs["dtype"]

Here it is:

Ah, thanks... still learning how all the pieces fit together!

scottyhq

comment created time in 2 days

Pull request review comment corteva/rioxarray

dtype for GDAL CInt16, rasterio complex_int16

 def to_raster(self, xarray_dataarray, tags, windowed, lock, compute, **kwargs):
         **kwargs
             Keyword arguments to pass into writing the raster.
         """
-        dtype = kwargs["dtype"]

It isn't obvious to me where the default kwargs (dtype) come from here (from a user's perspective, calling this function without any additional arguments: da.rio.to_raster('raster.tif')).

scottyhq

comment created time in 2 days

Pull request review comment corteva/rioxarray

dtype for GDAL CInt16, rasterio complex_int16

 def test_non_rectilinear__skip_parse_coordinates(open_rasterio):
     assert xds.rio.shape == (10, 10)
     with rasterio.open(test_file) as rds:
         assert rds.transform == xds.rio.transform()
+
+
+def test_complex_dtype(tmp_path):
+    test_file = os.path.join(TEST_INPUT_DATA_DIR, "cint16.tif")
+    xds = rioxarray.open_rasterio(test_file)
+    assert xds.rio.shape == (100, 100)
+    assert xds.dtype == "complex64"
+
+    # NOTE: waiting on https://github.com/mapbox/rasterio/issues/2206
+    # tmp_output = tmp_path / "tmp_cint16.tif"
+    # with pytest.warns(SerializationWarning):
+    #    xds.rio.to_raster(str(tmp_output), dtype='complex_int16')

I attempted to do this; let me know if the revisions are what you had in mind.

scottyhq

comment created time in 2 days

issue comment OSGeo/PROJ

proj 8.0.0 installation error database is locked

Did you identify what caused the "database locked" error?

Gopal-Murali

comment created time in 3 days

issue closed OSGeo/PROJ

proj 8.0.0 installation error database is locked

Hello,

I am trying to install PROJ 8.0.0 on a server that runs on Ubuntu 18.04.5 LTS.

I have already compiled and installed sqlite3 locally and added it to the PATH/LD_LIBRARY_PATH/PKG_CONFIG_PATH. However, when I tried to install proj-8.0.0 (using the ./configure and install commands) following the instructions here, I got the following error.

Make proj.db
rm -f proj.db
Error: near line 7: database is locked
Error: near line 12: database is locked
Error: near line 23: database is locked
Error: near line 31: database is locked
Error: near line 33: database is locked
Error: near line 52: database is locked
Error: near line 59: database is locked
Error: near line 73: database is locked
Error: near line 81: database is locked
Error: near line 100: database is locked
Error: near line 102: database is locked

Environment Information

  • PROJ version: 8.0.0

Installation method

  • from source

closed time in 3 days

Gopal-Murali

issue comment OSGeo/PROJ

proj 8.0.0 installation error database is locked

Thanks for the help. I did manage to install it.

Gopal-Murali

comment created time in 3 days

Pull request review comment OSGeo/PROJ

S2 projection

 pj_init_ctx_with_allow_init_epsg(PJ_CONTEXT *ctx, int argc, char **argv, int all
         return pj_default_destructor (PIN, PROJ_ERR_INVALID_OP_ILLEGAL_ARG_VALUE);
     }

+    /* S2 projection parameter */

Perfect, thanks!

marcus-elia

comment created time in 3 days

Pull request review comment OSGeo/PROJ

S2 projection

 pj_init_ctx_with_allow_init_epsg(PJ_CONTEXT *ctx, int argc, char **argv, int all
         return pj_default_destructor (PIN, PROJ_ERR_INVALID_OP_ILLEGAL_ARG_VALUE);
     }

+    /* S2 projection parameter */

When you say the setup function of the projection, are you referring to the block of code beginning here?

yes

If so, I'm wondering if you have another example of a projection that parses parameters outside of src/init.cpp, as it doesn't appear that ctx gets passed through to that code in order to call pj_param properly.

It is available in P->ctx. See e.g. https://github.com/OSGeo/PROJ/blob/master/src/projections/lcc.cpp#L89

marcus-elia

comment created time in 3 days

Pull request review comment OSGeo/PROJ

S2 projection

 pj_init_ctx_with_allow_init_epsg(PJ_CONTEXT *ctx, int argc, char **argv, int all
         return pj_default_destructor (PIN, PROJ_ERR_INVALID_OP_ILLEGAL_ARG_VALUE);
     }

+    /* S2 projection parameter */

@rouault - I've been working with @marcus-elia on this. When you say the setup function of the projection, are you referring to the block of code beginning here? https://github.com/OSGeo/PROJ/blob/a1b640a8f019ebebf2112d26f1c1ada39ef49194/src/projections/s2.cpp#L310-L314 If so, I'm wondering if you have another example of a projection that parses parameters outside of src/init.cpp, as it doesn't appear that ctx gets passed through to that code in order to call pj_param properly.

marcus-elia

comment created time in 3 days