A metadata catalog and search engine for geospatialized data


About


resto is a metadata catalog and a search engine dedicated to geospatialized data. Its original purpose was to handle Earth Observation satellite imagery, but it can be used to store any kind of metadata localized in time and space.

resto search API conforms to the SpatioTemporal Asset Catalog (STAC) specification v1.0.0 and to the CEOS OpenSearch Best Practice Document.

It is mentioned in ESA's "Exploitation Platform Common Core Components" as the closest implementation of a catalogue component according to the requirements specified in ESA's "Exploitation Platform Open Architecture".

Demo

The resto server at https://tamn.snapplanet.io provides up-to-date access to Landsat-8 and Sentinel-2 images.

You can browse it with the rocket web client.

References

Here are some projects that use resto.

If you plan to use resto and would like to have your project added to this list, feel free to contact support.

Support

resto is developed and maintained by jeobrowser.

For questions, support or anything related to resto, feel free to contact:

    jeobrowser
    50 quai de Tounis
    31000 Toulouse
    Tel   : +33 6 19 59 17 35
    email : [email protected]

Installation

After reviewing your configuration file, run one of the following commands:

(for production)

    ./deploy

(for development)

    ./deploy -e config-dev.env

The INSTALLATION.md file provides additional information on the installation process.

Comments
  • Names and structures of database entities repeated

    Over here is a bunch of DDL that creates tables:

    https://github.com/jjrom/resto/blob/a7ef168f961ed55b9d747212c9b9ed60abb2638c/_install/installDB.sh#L225

    Over here is PHP that generates SQL that queries those tables:

    https://github.com/jjrom/resto/blob/a7ef168f961ed55b9d747212c9b9ed60abb2638c/include/resto/RestoLicense.php#L93

    This seems to violate the Don't Repeat Yourself principle. It's a so-called WET approach - "Write Everything Twice"/"We Enjoy Typing"/"Waste Everyone's Time".

    If the model classes were taught how to generate DDL to define the tables they represent, and called from installDB.sh instead of open-coding the same entity names, this needless repetition would be fixed.
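    The fix the issue suggests can be sketched as follows, in Python for brevity (the project itself is PHP; the class, table, and column names below are hypothetical, not resto's actual schema): each model owns a single definition of its entity, and both the DDL and the query SQL are derived from it.

```python
# Sketch of a single-source-of-truth model: the table definition lives in one
# place, and both the CREATE TABLE statement and the SELECT list are derived
# from it. All names here are hypothetical, not resto's actual schema.

class LicenseModel:
    TABLE = "usermanagement.licenses"
    COLUMNS = {
        "licenseid": "TEXT PRIMARY KEY",
        "grantedcountries": "TEXT",
        "signaturequota": "INTEGER",
    }

    @classmethod
    def create_table_ddl(cls):
        # What an install script would call instead of open-coding the DDL
        cols = ", ".join(f"{name} {sqltype}" for name, sqltype in cls.COLUMNS.items())
        return f"CREATE TABLE {cls.TABLE} ({cols});"

    @classmethod
    def select_sql(cls):
        # What query-building code would call instead of repeating the names
        return f"SELECT {', '.join(cls.COLUMNS)} FROM {cls.TABLE}"
```

    With this shape, renaming a column or table happens in exactly one place.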

    enhancement 
    opened by ukyg9e5r6k7gubiekd6 5
  • Please use placeholders and remove all modification of data before it reaches the RDBMS

    Is there a particular reason why this, and other code, isn't using placeholders?

    https://github.com/jjrom/resto/blob/a7ef168f961ed55b9d747212c9b9ed60abb2638c/include/resto/Drivers/RestoDatabaseDriver_PostgreSQL.php#L598

    Also, this is expensive - it round-trips to the RDBMS simply to do some string munging.

    Also, there's no need to munge string data before it goes to the database. It seems the only use of this is to escape data so it can be substituted into the middle of SQL by callers. If those callers used placeholders (as they should), then this code wouldn't be required at all.

    Also, why does this remove accents? There's nothing inherently wrong with searching for text including accents. Why prevent the user from searching for whatever takes their fancy today? For example, why should I be prevented from searching for the word "Géographie"? Why should I never be able to search for "Flüsse"? I can't even begin to fathom how this approach would fare when faced with a language like Vietnamese.

    Here's a good explanation of why you should use prepared statements and placeholders when accessing an RDBMS from code you're writing:

    http://use-the-index-luke.com/sql/where-clause/bind-parameters

    Here's the documentation on the function this code is using:

    http://php.net/manual/en/function.pg-query.php

    Notice how it says close to the top of the page:

    pg_query_params() should be preferred in most cases.

    This code and its callers are all such cases.
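    As a concrete illustration of the placeholder style the issue recommends, here is a sketch using Python's sqlite3, chosen purely because it is self-contained; in resto's PHP/PostgreSQL stack the equivalent would be pg_query_params or PDO prepared statements.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE features (id INTEGER PRIMARY KEY, title TEXT)")
conn.execute("INSERT INTO features (title) VALUES (?)", ("Géographie",))

# The raw user input goes into the parameter tuple untouched: no escaping
# round-trip to the RDBMS, no accent stripping, no injection risk.
user_input = "Géographie"
row = conn.execute(
    "SELECT title FROM features WHERE title = ?", (user_input,)
).fetchone()
print(row[0])  # the accented text is matched as-is
```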

    enhancement 
    opened by ukyg9e5r6k7gubiekd6 5
  • pystac-client request with intersects fails against resto

    Describe the bug

    pystac_client seems to have an issue when submitting a geometry to a resto /search endpoint.

    To Reproduce

    from pystac_client import Client
    client = Client.open("https://tamn.snapplanet.io")
    collection = "S2"
    geometry = '{"type": "Polygon", "coordinates": [[[100.0, 0.0], [101.0, 0.0], [101.0, 1.0], [100.0, 1.0], [100.0, 0.0]]]}'
    search = client.search(method="GET", collections=[collection], intersects=geometry)
    print(len(list(search.items_as_dicts())))
    

    Exception is thrown:

    Traceback (most recent call last):
      File "<input>", line 1, in <module>
        print(len(list(search.items_as_dicts())))
      File "/Users/philvarner/code/stac-api-validator/.venv/lib/python3.10/site-packages/pystac_client/item_search.py", line 682, in items_as_dicts
        for page in self._stac_io.get_pages(
      File "/Users/philvarner/code/stac-api-validator/.venv/lib/python3.10/site-packages/pystac_client/stac_api_io.py", line 220, in get_pages
        page = self.read_json(url, method=method, parameters=parameters)
      File "/Users/philvarner/code/stac-api-validator/.venv/lib/python3.10/site-packages/pystac/stac_io.py", line 198, in read_json
        txt = self.read_text(source, *args, **kwargs)
      File "/Users/philvarner/code/stac-api-validator/.venv/lib/python3.10/site-packages/pystac_client/stac_api_io.py", line 97, in read_text
        return self.request(href, *args, **kwargs)
      File "/Users/philvarner/code/stac-api-validator/.venv/lib/python3.10/site-packages/pystac_client/stac_api_io.py", line 144, in request
        raise APIError.from_response(resp)
    pystac_client.exceptions.APIError: {"ErrorMessage":"search - pQuery - ","ErrorCode":400}
    

    Filing this against resto instead of pystac-client as I know this works with other STAC APIs, but it could be an issue with pystac-client.

    Expected behavior

    The results are paged over and the total count is printed.
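    One cheap variation to try when isolating this (an assumption to verify, not a known fix): pass the geometry as a parsed GeoJSON dict rather than a JSON string, so the client's serialization of the string is taken out of the picture.

```python
import json

# Same polygon as in the reproduction above, parsed into a dict before use.
geometry_str = (
    '{"type": "Polygon", "coordinates": [[[100.0, 0.0], [101.0, 0.0],'
    ' [101.0, 1.0], [100.0, 1.0], [100.0, 0.0]]]}'
)
geometry = json.loads(geometry_str)

assert geometry["type"] == "Polygon"
assert geometry["coordinates"][0][0] == [100.0, 0.0]

# client.search(method="GET", collections=["S2"], intersects=geometry)
```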

    opened by philvarner 4
  • Search by id?

    I'm trying to find out download URLs on PEPS by product id. This only seems to work for S1:

    https://peps.cnes.fr/resto/api/collections/S1/search.json?identifier=S1A_IW_GRDH_1SDH_20160723T101221_20160723T101250_012274_013151_16AE

    gives me back the correct result, but doing the same for S2 or S3 returns nothing:

    https://peps.cnes.fr/resto/api/collections/S2/search.json?maxRecords=1&identifier=S2A_MSIL1C_20160222T065901_N0204_R063_T39PUS_20180205T191150

    https://peps.cnes.fr/resto/api/collections/S3/search.json?maxRecords=2&identifier=S3A_OL_1_EFR____20180403T155615_20180403T155915_20180403T175343_0179_029_325_2340_SVL_O_NR_002

    (I realize resto authors have no direct control of PEPS, but I thought I'll ask here first to make sure I'm sending correct queries.)

    opened by simonff 4
  • Redundant ErrorCode in JSON response and HTTP status

    This is redundant:

    https://github.com/jjrom/resto/blob/a7ef168f961ed55b9d747212c9b9ed60abb2638c/include/resto/Resto.php#L169

    HTTP already defines a status, which communicates to the client the success/failure of the client's request. HTTP also defines a body, wherein the server can include a lengthy human-readable description of what went well or badly, why, and what the client's user might do about that.

    Given that, there is no need for an attribute 'ErrorCode' in returned JSON responses. It adds nothing to the information already present. Worse, this redundancy in the protocol allows RESTo to respond in apparently contradictory and possibly meaningless ways.

    What does it mean if RESTo returns an HTTP status of 404 ("Not found"), but the HTTP body contains data that looks like the data I requested, with an ErrorCode indicating success? Is the 404 wrong (should be 200), because this is a successful request? Or should I trust the 404 (what I asked for doesn't exist)? In that case, what's this data in the body? Is it some other data that I didn't ask for? Or if this circumstance is impossible, why does RESTo's protocol permit it?

    What would it mean if RESTo returned "200 Ok", but then an HTTP body containing a JSON response containing an ErrorCode and not containing any data? How is this a success? Where is my data?

    How am I to write a correct client for RESTo? I suppose I should:

    1. Check the HTTP status. If it indicates an error, assume there has been an error, and attempt to handle it (perhaps, display a diagnostic). Stop.
    2. Otherwise, check the content type. If it's not what I expected (eg not JSON), assume there's been an error, handle it, then stop.
    3. Otherwise, if we got a JSON response, look for an ErrorCode attribute. If it's present, and it indicates an error, then assume the HTTP status was a lie, and there really has been an error, so attempt to handle the error, then stop.
    4. Otherwise, if there's no ErrorCode or it indicates success, then go looking for data. If we find it, assume it's the data we want; use those data, then stop.
    5. Otherwise, if there's no data, then assume that both the HTTP status and the ErrorCode are lies, and that something went wrong. But we have no idea what went wrong, so we can't really handle it meaningfully.

    This is unnecessarily complex and difficult. Frameworks like Angular don't support this strange algorithm out of the box, so custom development is required. Novice Javascript programmers will almost certainly get such development wrong.

    Using HTTP in general, and specifically using its status mechanism, is one of the core principles of REST API design.
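    The five-step defensive algorithm above can be sketched in Python, to show how much logic the redundancy pushes onto every client (the function and message shapes are illustrative, not part of any actual client):

```python
def interpret_response(status, content_type, body):
    """Return ("ok", data) or ("error", reason) for a resto-style response."""
    if status >= 400:
        # Step 1: trust the HTTP status first
        return ("error", f"HTTP {status}")
    if content_type != "application/json":
        # Step 2: unexpected content type
        return ("error", f"unexpected content type {content_type}")
    if body and "ErrorCode" in body:
        # Step 3: the status said OK, but the body disagrees
        return ("error", f"ErrorCode {body['ErrorCode']}")
    if body:
        # Step 4: looks like the data we asked for
        return ("ok", body)
    # Step 5: no error signalled anywhere, yet no data either
    return ("error", "no data and no error reported")
```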

    enhancement 
    opened by ukyg9e5r6k7gubiekd6 4
  • HTTP status numbers need to be within ranges given in HTTP RFCs

    https://github.com/jjrom/resto/blob/a7ef168f961ed55b9d747212c9b9ed60abb2638c/include/resto/Utils/RestoLogUtil.php#L81

    These need to be within the existing ranges, of size 100, defined by HTTP in order for the client to understand them. If they are not, clients are within specification to fail to understand these invented status codes. See for example RFC 2616 section 6.1.1, especially the last paragraph.

    Note, RFC 2616 does not require you to use one of the values listed in RFC 2616 section 6.1.1. You're free to invent your own HTTP statuses, if you truly believe that not a single one of the existing statuses is applicable (which would be surprising). But even in that case you should stay within whichever one of the given ranges is applicable.
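    The constraint is simple to state in code: a status is interpretable by its class only if its first digit is 1-5. A sketch of the range check (not resto code):

```python
def status_class(code):
    """Return the HTTP status class (1-5), or None for codes outside
    every range defined by RFC 2616 (and its successor, RFC 9110)."""
    if 100 <= code <= 599:
        return code // 100
    return None

# A client can act on the class of an unknown-but-in-range code (e.g. treat
# an unknown 4xx as a client error); an out-of-range invention tells it nothing.
```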

    enhancement 
    opened by ukyg9e5r6k7gubiekd6 4
  • datetime filter disallows some valid strings and allows some invalid strings

    The following were found running stac-api-validator. There is a description of datetime parsing in this PR: https://github.com/radiantearth/stac-api-spec/pull/162/files

    The error response says:

    "validateFilterString - Value for \"datetime\" must follow the pattern ^[a-zA-Z0-9\\-\\/\\.\\:]+$"
    

    though this is not a correct regex for RFC3339 datetimes.

    These datetime strings, used as a parameter to /search, resulted in a 400 when they should have been accepted:

    • Search with datetime=1985-04-12T23:20:50,52Z returned status code 400
    • Search with datetime=1996-12-19T16:39:57+00:00 returned status code 400
    • Search with datetime=1996-12-19T16:39:57+08:00 returned status code 400
    • Search with datetime=../1985-04-12T23:20:50.52Z returned status code 400
    • Search with datetime=1985-04-12T23:20:50.52Z/.. returned status code 400
    • Search with datetime=1985-04-12T23:20:50.52+01:00/1986-04-12T23:20:50.52Z+01:00 returned status code 400
    • Search with datetime=1985-04-12T23:20:50.52-01:00/1986-04-12T23:20:50.52Z-01:00 returned status code 400
    • Search with datetime=1937-01-01T12:00:27.87+01:00 returned status code 400
    • Search with datetime=1937-01-01T12:00:27.8710+01:00 returned status code 400
    • Search with datetime=1937-01-01T12:00:27.8+01:00 returned status code 400
    • Search with datetime=2020-07-23T00:00:00.000+03:00 returned status code 400
    • Search with datetime=2020-07-23T00:00:00+03:00 returned status code 400
    • Search with datetime=1985-04-12t23:20:50.000z returned status code 400
    These datetime strings should not be accepted, but were allowed:

    • Search with datetime=1985-04-12 returned status code 200 instead of 400 (date only, no time)
    • Search with datetime=1985-12-12T23:20:50.52 returned status code 200 instead of 400 (missing timezone)
    • Search with datetime=1986-04-12T23:20:50.52Z/1985-04-12T23:20:50.52Z returned status code 200 instead of 400 (interval end before start)
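    A shape-only check closer to RFC 3339 than the character class quoted above might look like this (a sketch: it validates structure only, not calendar correctness, and interval handling with "/" and open ends ".." would layer on top of it):

```python
import re

# date, 'T', time, optional fractional seconds, then a MANDATORY timezone
# (Z or a +hh:mm / -hh:mm offset); 'T' and 'Z' are case-insensitive.
RFC3339 = re.compile(
    r"^\d{4}-\d{2}-\d{2}[Tt]\d{2}:\d{2}:\d{2}(\.\d+)?([Zz]|[+-]\d{2}:\d{2})$"
)

def is_rfc3339(value):
    return RFC3339.match(value) is not None

# Cases from the report above:
assert is_rfc3339("1996-12-19T16:39:57+00:00")   # offset form, valid
assert is_rfc3339("1985-04-12t23:20:50.000z")    # lowercase t/z, valid
assert not is_rfc3339("1985-12-12T23:20:50.52")  # missing timezone
assert not is_rfc3339("1985-04-12")              # date only, no time
```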
    opened by philvarner 3
  • API spec endpoint contains invalid servers entry, causing OGC API validation to fail

    The API spec endpoint https://tamn.snapplanet.io/api contains a field:

    "servers": [
        {
          "url": "http://127.0.0.1:5252",
          "description": "resto localhost server"
        }
      ],
    

    When the OGC API Features Part 1 validator (https://cite.opengeospatial.org/teamengine/about/ogcapi-features-1.0/1.0/site/) is run against this, it uses the entries in servers to run the tests (e.g., it uses the root endpoint to find the api spec, gets the api spec, and then runs everything else against the servers in the api spec instead of the root endpoint).

    Since localhost isn't a valid endpoint to run this (most of the time), the test "validate Conformance Operation And Response" fails, which then cascades to skipping most of the other tests.

    opened by philvarner 3
  • Please don't use the term "REST" when you don't really mean what it means

    How about just calling this an "HTTP API" instead? That's accurate and honest.

    https://github.com/jjrom/resto/blob/a7ef168f961ed55b9d747212c9b9ed60abb2638c/include/resto/RestoRoute.php#L19

    The inventor of the term "REST", Roy Fielding, gets unhappy when people use the term "REST" to describe things like this which aren't REST, in this case (and many others) because they don't implement HATEOAS. Please don't make Roy even more unhappy than he already is.

    Here is Roy's dissertation where he coins the term "REST".

    Here is Roy bemoaning people (like the authors of this software) misusing his terminology, and wondering why they (wrongly) think that HATEOAS is an optional part of REST, or not a constraint, or something unimportant. It's almost like they didn't actually read and understand his dissertation...

    opened by ukyg9e5r6k7gubiekd6 3
  • Search with limit above max advertised returns 400 instead of 200

    This was recently changed in 1.0.0-rc.2 to explicitly say that a 400 had to be returned if the limit value was above the max advertised, but we then realized this was inconsistent in OAFeat; it has been clarified in an unreleased version of that spec such that an error should not be returned, and instead the limit should be treated as if it were the maximum.
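    The clarified behavior amounts to clamping instead of rejecting (a sketch; MAX_LIMIT stands in for whatever maximum the service advertises):

```python
MAX_LIMIT = 500  # hypothetical advertised maximum

def effective_limit(requested):
    # Per the clarified OGC API - Features wording: values above the
    # advertised maximum are treated as the maximum, not rejected with 400.
    return min(max(requested, 1), MAX_LIMIT)
```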

    opened by philvarner 2
  • Null spatial bbox and temporal interval in collection

    Describe the issue

    After registering a collection and retrieving it, the fields spatial.bbox and temporal.interval are null.

    The registered collection is:

    {
        "stac_version": "1.0.0",
        "id": "S2",
        "title": "S2 L1C",
        "type": "Collection",
        "license": "public",
        "description": "Sentinel-2a and Sentinel-2b imagery, processed to Level 2A (Surface Reflectance) and converted to Cloud Optimized GeoTIFFs",
        "providers": [],
        "extent": {
            "temporal": {
                "interval": [
                    [
                        "2016-11-02T10:01:49Z",
                        "2022-10-10T12:07:44Z"
                    ]
                ]
            },
            "spatial": {
                "bbox": [
                    [
                        -27.00017208992278,
                        -47.69149719793699,
                        65.01059281986201,
                        38.84898033153399
                    ]
                ]
            }
        }
    }
    

    The request is copied (more or less) from the Africa datacube.

    Retrieving the collection, the following response is obtained:

    {
    "stac_version": "1.0.0",
        "id": "resto",
        "type": "Catalog",
        "title": "resto search service",
        "description": "Search on all collections",
        "keywords": [
            "resto"
        ],
        "links": [
            {
                "rel": "self",
                "type": "application/json",
                "href": "http://127.0.0.1:5252/collections"
            },
            {
                "rel": "root",
                "type": "application/json",
                "href": "http://127.0.0.1:5252"
            },
            {
                "rel": "items",
                "title": "All collections",
                "matched": 0,
                "type": "application/geo+json",
                "href": "http://127.0.0.1:5252/search"
            },
            {
                "rel": "child",
                "type": "application/json",
                "title": "S2 L1C",
                "description": "Sentinel-2a and Sentinel-2b imagery, processed to Level 2A (Surface Reflectance) and converted to Cloud Optimized GeoTIFFs",
                "matched": 0,
                "href": "http://127.0.0.1:5252/collections/S2",
                "roles": [
                    "collection"
                ]
            }
        ],
        "extent": {
            "spatial": {
                "bbox": [
                    null
                ],
                "crs": "http://www.opengis.net/def/crs/OGC/1.3/CRS84"
            },
            "temporal": {
                "interval": [
                    [
                        null,
                        null
                    ]
                ],
                "trs": "http://www.opengis.net/def/uom/ISO-8601/0/Gregorian"
            }
        },
        "resto:info": {
            "osDescription": {
                "ShortName": "resto",
                "LongName": "resto search service",
                "Description": "Search on all collections",
                "Tags": "resto",
                "Developer": "Jérôme Gasperi",
                "Contact": "[email protected]",
                "Query": "europe 2015",
                "Attribution": "Copyright 2018, All Rights Reserved"
            }
        },
        "collections": [
            {
                "stac_version": "1.0.0",
                "stac_extensions": [],
                "id": "S2",
                "type": "Collection",
                "title": "S2 L1C",
                "version": "1.0.0",
                "description": "Sentinel-2a and Sentinel-2b imagery, processed to Level 2A (Surface Reflectance) and converted to Cloud Optimized GeoTIFFs",
                "license": "public",
                "extent": {
                    "spatial": {
                        "bbox": [
                            null
                        ],
                        "crs": "http://www.opengis.net/def/crs/OGC/1.3/CRS84"
                    },
                    "temporal": {
                        "interval": [
                            [
                                null,
                                null
                            ]
                        ],
                        "trs": "http://www.opengis.net/def/uom/ISO-8601/0/Gregorian"
                    }
                },
                "links": [
                    {
                        "rel": "self",
                        "type": "application/json",
                        "href": "http://127.0.0.1:5252/collections/S2"
                    },
                    {
                        "rel": "root",
                        "type": "application/json",
                        "href": "http://127.0.0.1:5252"
                    },
                    {
                        "rel": "items",
                        "type": "application/geo+json",
                        "href": "http://127.0.0.1:5252/collections/S2/items"
                    }
                ],
                "resto:info": {
                    "model": "DefaultModel",
                    "lineage": [
                        "DefaultModel"
                    ],
                    "osDescription": {
                        "ShortName": "S2 L1C",
                        "LongName": null,
                        "Description": "Sentinel-2a and Sentinel-2b imagery, processed to Level 2A (Surface Reflectance) and converted to Cloud Optimized GeoTIFFs",
                        "Tags": null,
                        "Developer": null,
                        "Contact": null,
                        "Query": null,
                        "Attribution": null
                    },
                    "owner": "218364962113585154",
                    "visibility": 100
                },
                "keywords": [],
                "providers": [],
                "assets": {},
                "summaries": {
                    "datetime": {
                        "minimum": null,
                        "maximum": null
                    }
                }
            }
        ]
    }
    

    Expected behavior

    Temporal and spatial extents should not be null.

    Additional context

    I am using the docker image, version jjrom/resto:6.1.7. The database is initialized by adding the SQL files to the folder /docker-entrypoint-initdb.d in a docker-compose file running postgresql and resto. Users are also added using an SQL file (I don't think this is an issue, but it is something that is not executed as expected).

    I am pretty sure this is not a bug in resto but something I am doing wrong, though I can't find it :(

    Ah, and congratulations for this project! I have used it in creodias for a long time (before knowing they used resto), and I am glad I have a chance of using it in my personal projects :)

    opened by lhcorralo 2
  • new id when feature is added

    Describe the issue

    Trying to insert a new feature with 'id' set, the item is inserted under another id.

    First of all, clarifying that I am not sure if this is a bug or a misunderstanding of the STAC specification on my part (probably the latter).

    Expected behavior

    If an id is provided, it should be used for later data retrieval.

    Additional context

    Assuming that the collection is already created, a POST request with URI http://{{STAC_SERVER}}:{{STAC_PORT}}/collections/sentinel-2-l1c/items and body:

    {
    	"type": "Feature",
    	"stacVersion": "1.0.0",
    	"stacExtensions": ["order"],
    	"id": "d6376531-3ec5-35f2-8a07-4d1f7d3116c8",
    	"collection": "sentinel-2-l1c",
    	"geometry": {
    		"type": "Polygon",
    		"coordinates": [
    			[
    				[0.23093206, 49.61959688],
    				[0.28537105, 48.63303182],
    				[1.77513104, 48.65851892],
    				[1.75053741, 49.64598097],
    				[0.23093206, 49.61959688]
    			]
    		],
    		"crs": {
    			"type": "name",
    			"properties": {
    				"name": "EPSG:4326"
    			}
    		}
    	},
    	"bbox": null,
    	"properties": {
    		"datetime": null,
    		"startDatetime": "2022-12-09T11:04:41.024Z",
    		"endDatetime": null,
    		"created": null,
    		"updated": "2022-12-09T15:19:25.714988Z",
    		"order:status": "ordered"
    	},
    	"links": [],
    	"assets": {}
    }
    

    The server returns

    {
        "status": "success",
        "message": "Inserted features",
        "inserted": 1,
        "inError": 0,
        "features": [
            {
                "featureId": "1c8338f8-a225-52cc-a018-f975f1996509",
                "productIdentifier": "d6376531-3ec5-35f2-8a07-4d1f7d3116c8",
                "facetsStored": true
            }
        ],
        "errors": []
    }
    

    The problem arises from the need to later update the item (in particular the order:status property). I need the id to make a PUT request. Making the PUT request directly is not a solution (the item must not be overwritten in this case).

    From the result, I see that the provided id is stored as productIdentifier. I could get it from there, but I am worried that this would be an implementation-specific solution: I could not find any details about this particular use case in the STAC API specification, and I cannot test POST requests against other servers (I tried stac-fastapi but could not start it up).

    I have seen the code in https://github.com/jjrom/resto/blob/d249b8c66fffadf741a13198ef5c5e439066f5a4/app/resto/core/RestoModel.php#L804 but I am not a php programmer, and I cannot guess why my test is not working.

    opened by lhcorralo 2
Releases
  • v6.2.0(Dec 14, 2022)

    What's Changed

    • [BREAKING CHANGE] Upgrade resto-database to postgis/postgis:14-master by @jjrom in https://github.com/jjrom/resto/pull/316
    • [STAC] Upgrade to stac-api 1.0.0-rc.1 by @jjrom in https://github.com/jjrom/resto/pull/319
    • Add support for NOT in CQL 2 by @jjrom in https://github.com/jjrom/resto/pull/320
    • Correct issues raised from PR #319 by @jjrom in https://github.com/jjrom/resto/pull/325
    • Correct (huge) performance issue on CQL2 filter for datetime by @jjrom in https://github.com/jjrom/resto/pull/326
    • Performance optimization by @jjrom in https://github.com/jjrom/resto/pull/327
    • Typo in previous commit by @jjrom in https://github.com/jjrom/resto/pull/328
    • Add POSTGRES_MAX_PARALLEL_WORKERS by @jjrom in https://github.com/jjrom/resto/pull/333
    • Add a bunch of postgresql conf by @jjrom in https://github.com/jjrom/resto/pull/334
    • POST collection now support input extent (correct issue #331) by @jjrom in https://github.com/jjrom/resto/pull/335
    • Correct issues #336 and #337 by @jjrom in https://github.com/jjrom/resto/pull/338
    • Upgrade resto to jjrom/nginx-fpm:8.1-1 by @jjrom in https://github.com/jjrom/resto/pull/340
    • Remove unusued SAML code by @jjrom in https://github.com/jjrom/resto/pull/341

    Full Changelog: https://github.com/jjrom/resto/compare/v6.1.7...v6.2.0

  • v6.1.7(May 5, 2022)

  • v6.1.6(May 4, 2022)

    Highlights

    • Support both resto and STAC query filter names in search (e.g. "processingLevel" and "processing:level")

  • v6.1.5(Apr 9, 2022)

  • v6.1.4(Feb 2, 2022)

    Highlights

    • Hashtags from the description field are attached to the collection facet, not to '*'

  • v6.1.3(Jan 18, 2022)

    Highlights

    • [STAC] Return hashtags in summaries

  • v6.1.2(Jan 14, 2022)

    Highlights

    • [STAC] Generalize the support of ssys:targets both at the collection and feature level
    • Minor bug fixes

  • v6.1.0(Aug 31, 2021)

    Highlights

    • Corrects inconsistencies to conform to STAC v1.0.0.
    • Conforms to STAC-API v1.0.0-beta.3.
    • The default "resto" schema can now be set as an environment variable (DATABASE_SCHEMA)

  • v6.0.0(Jun 3, 2021)

    Highlights

    resto v6.0.0 (aka "Dead Dormouse") conforms to STAC v1.0.0.

  • v6.0.0-rc2(Aug 31, 2021)

  • v6.0.0-rc1(Jul 26, 2020)

  • v2.5.2(Dec 14, 2018)

  • v2.5.1(Aug 9, 2018)

  • v2.5.0(Jun 30, 2018)

    • Add support for update feature through PUT /collections/{collection}/{feature} endpoint
    • Add support for update feature status through PUT /collections/{collection}/{feature}/status endpoint

    [WARNING] This is a breaking change! Existing collection features tables must be updated with the following SQL commands, where ${collectionName} is the name of the collection in lower case:

      ALTER TABLE _${collectionName}.features ADD COLUMN status TEXT DEFAULT 0;
      CREATE INDEX _${collectionName}_features_status_idx ON _${collectionName}.features USING btree(status);

  • v2.4.3(Jun 30, 2018)

    Note: this version corrects a database deadlock issue when performing massive parallel ingestions.

    Changelog

    • Add "_useItag" query parameter to activate/deactivate iTag keywords during ingestion (activated by default)
    • Check input geometry during POST feature and return HTTP 400 if geometry is invalid
    • Add "storeFacets" boolean option in config.php to activate/deactivate facets computation during ingestion
    • Add updateFacets.php script to compute facets outside of ingestion process through cron job
    • Update to iTag v3.1.3
    • Add unique constraint on identifier column per collection
    • Correct OpenSearch description - use "parameters:Option" instead of incorrect "parameters:Options"
  • v2.4.2(Apr 19, 2018)

  • v2.4.1(Aug 31, 2017)

  • v2.4(Feb 19, 2017)

  • v2.3(Sep 9, 2016)

    resto v2.3

    Main upgrade - allows several independent resto projects under the same database instance

    For other changes, see https://github.com/jjrom/resto/compare/v2.2...v2.3

    Upgrade

    To upgrade from an existing resto v2.2 database, run the following script:

      psql -d resto -U postgres < _install/migration/from220To230.sql
    
  • v2.2(Apr 4, 2016)

    resto v2.2

    The main feature of resto v2.2 is the full support of geometries crossing the -180/180 degrees date line (see #203)

    For other changes, see https://github.com/jjrom/resto/compare/v2.1...v2.2

    Warning The GeoJSON output interface differs slightly from resto v2.1: the keywords property is now an array of objects instead of an object of objects (see https://github.com/jjrom/resto/commit/eb640401278c46317ccdc118859e2a2572ec7771)
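    Schematically, the change looks like this (the keyword keys and property names below are illustrative only, not the exact resto output): a keyword that v2.1 returned as a named member of an object becomes one element of an array in v2.2:

    ```json
    {
      "keywords_in_v2_1": {
        "continent:europe": { "name": "Europe", "type": "continent" }
      },
      "keywords_in_v2_2": [
        { "id": "continent:europe", "name": "Europe", "type": "continent" }
      ]
    }
    ```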

    Upgrade

    To upgrade from an existing resto v2.1 database, run the following scripts:

      psql -d resto -U postgres < _install/migration/from210To220.sql
      _install/migration/updateKeywords.php -u resto -p resto > update_keywords.sql
      psql -d resto -U postgres < update_keywords.sql
    
  • v2.1(Dec 25, 2015)

    resto v2.1 adds new functionalities, bug fixes and security enhancements over resto v2.0. Major enhancements include:

    • Simplification of REST routes - separation between user and administration routes
    • Rewrite of rights management
    • Support for multi-groups (i.e. a user can be in one or several groups)
    • Separate licenses from collections
    • Add licenses endpoints - GET/POST/DELETE
    • Support for license per collection and per feature
    • Constrain download to grant properties per license
    • Add validation mechanism - a user should be validated to be able to download/view products
    • Add support for WMS proxying - i.e. WMS endpoint should be authorized like the download endpoint
    • Analysis property split into What, When, Where, Errors blocks for better readability and handling client side
    • etc.

    IMPORTANT: to upgrade from an existing v2.0.x instance, launch the migration script, i.e. psql -d resto -U postgres < _install/migration/from204To210.sql

  • v2.1RC1(Jul 16, 2015)

    resto v2.1 Release Candidate 1 includes:

    • Simplification of routes
    • Complete rewrite of rights management
    • Support for multi-groups (i.e. a user can be in one or several groups)
    • Advanced license support based on user profile
    • Support for license both per collection and per feature
    • Add validation mechanism - only validated users can download/view products
    • Add support for WMS proxying - i.e. WMS endpoint is accessible only if the user has the visualize right
  • v2.0.4(Jul 2, 2015)

  • v2.0.3(Jun 22, 2015)

  • v2.0.2(Jun 4, 2015)

  • v2.0.1(May 28, 2015)

  • v2.0(May 11, 2015)

  • v2.0RC2(Apr 12, 2015)

  • v2.0RC1(Apr 12, 2015)

  • v1.0(Feb 19, 2015)

Owner
Jerome Gasperi
Earth Observation Expert | Creator of snapplanet.io
SphinxQL Query Builder generates SphinxQL, a SQL dialect, which is used to query the Sphinx search engine. (Composer Package)

Query Builder for SphinxQL About This is a SphinxQL Query Builder used to work with SphinxQL, a SQL dialect used with the Sphinx search engine and it'

FoolCode 318 Oct 21, 2022
A site search engine

THIS PACKAGE IS IN DEVELOPMENT, DO NOT USE IN PRODUCTION YET A site search engine This package can crawl your entire site and index it. Support us We

Spatie 219 Nov 8, 2022
Laravel search is package you can use it to make search query easy.

Laravel Search Installation First, install the package through Composer. composer require theamasoud/laravel-search or add this in your project's comp

Abdulrahman Masoud 6 Nov 2, 2022
Sphinx Search library provides SphinxQL indexing and searching features

Sphinx Search Sphinx Search library provides SphinxQL indexing and searching features. Introduction Installation Configuration (simple) Usage Search I

Ripa Club 62 Mar 14, 2022
Unmaintained: Laravel Searchy makes user driven searching easy with fuzzy search, basic string matching and more to come!

!! UNMAINTAINED !! This package is no longer maintained Please see Issue #117 Here are some links to alternatives that you may be able to use (I do no

Tom Lingham 533 Nov 25, 2022
Build and execute an Elasticsearch search query using a fluent PHP API

PACKAGE IN DEVELOPMENT, DO NOT USE YET Build and execute ElasticSearch queries using a fluent PHP API This package is a lightweight query builder for

Spatie 94 Dec 14, 2022
Search among multiple models with ElasticSearch and Laravel Scout

For PHP8 support use php8 branch For Laravel Framework < 6.0.0 use 3.x branch The package provides the perfect starting point to integrate ElasticSear

Sergey Shlyakhov 592 Dec 25, 2022
This is an open source demo of smart search feature implemented with Laravel and Selectize plugin

Laravel smart search implementation See demo at: http://demos.maxoffsky.com/shop-search/ Tutorial at: http://maxoffsky.com/code-blog/laravel-shop-tuto

Maksim Surguy 215 Sep 8, 2022
Laravel package to search through multiple Eloquent models. Supports sorting, pagination, scoped queries, eager load relationships and searching through single or multiple columns.

Laravel Cross Eloquent Search This Laravel package allows you to search through multiple Eloquent models. It supports sorting, pagination, scoped quer

Protone Media 844 Dec 25, 2022
A search package for Laravel 5.

Search Package for Laravel 5 This package provides a unified API across a variety of different full text search services. It currently supports driver

Mark Manos 354 Nov 16, 2022
A php trait to search laravel models

Searchable, a search trait for Laravel Searchable is a trait for Laravel 4.2+ and Laravel 5.0 that adds a simple search function to Eloquent Models. S

Nicolás López Jullian 2k Dec 27, 2022
Driver for Laravel Scout search package based on https://github.com/teamtnt/tntsearch

TNTSearch Driver for Laravel Scout - Laravel 5.3 - 8.0 This package makes it easy to add full text search support to your models with Laravel 5.3 to 8

TNT Studio 1k Dec 27, 2022
Kirby docs search workflow for Alfred

Kirby Docs search workflow for Alfred 4 An ultra-fast Kirby Docs search workflow for Alfred 4 Installation Download the latest version Install the wor

Adam Kiss 30 Dec 29, 2022
A TYPO3 extension that integrates the Apache Solr search server with TYPO3 CMS. dkd Internet Service GmbH is developing the extension. Community contributions are welcome. See CONTRIBUTING.md for details.

Apache Solr for TYPO3 CMS A TYPO3 extension that integrates the Apache Solr enterprise search server with TYPO3 CMS. The extension has initially been

Apache Solr for TYPO3 126 Dec 7, 2022
Support search in flarum by sonic

flarum-sonic Support search by Sonic Install Sonic following this guide Install the extension: composer require ganuonglachanh/sonic Change info in a

null 18 Dec 21, 2022
Your personal job-search assistant

JobsToMail Your personal job-search assistant About JobsToMail is an open source web application that allows users to sign up to receive emails with j

JobApis 93 Nov 13, 2022
This modules provides a Search API Backend for Elasticsearch.

Search API ElasticSearch This modules provides a Search API Backend for Elasticsearch. This module uses the official Elasticsearch PHP Client. Feature

null 1 Jan 20, 2022
Laravel Searchable - This package makes it easy to get structured search from a variety of sources

This package makes it easy to get structured search from a variety of sources. Here's an example where we search through some model

Spatie 1.1k Dec 31, 2022
Search products, categories, brands or tags with ElasticSearch

ElasticSearch for Shopaholic This plugin allows you to use ElasticSearch as search engine for Shopaholic. Benefits Easy to install, easy to use Opened

Biz-Mark 4 Feb 18, 2022