Quick Start

The Semantic Search is a dependent component of the Ontotext Semantic Objects, so you need a running Semantic Objects instance in order to use it.

Docker

The Semantic Search is distributed as Docker containers, all of which are published on the Ontotext Docker Hub. You will need to install the Docker daemon on the machine on which you will run the service.

Follow the Docker installation guide.

Docker Compose

The Semantic Search can be run using a Docker Compose configuration on your developer machine.

You will need to install Docker Compose on the machine on which you wish to run the service.

Follow the Docker Compose installation guide.

To deploy the Semantic Search, you need to download this docker-compose.yaml example that starts the Semantic Search along with Semantic Objects services (Semantic Objects, Workbench, and GraphDB), Elasticsearch, and Kibana.

After deploying the Semantic Objects, the Semantic Search will be available at http://localhost:9980.

To configure the Semantic Search, use the declarative Semantic Object schema and its configuration options.

The Ontotext Semantic Search is available under the commercial time-based license of the Semantic Objects. To obtain an evaluation license, please contact the Ontotext team and see the documentation on Setting up Licenses.

Once you have obtained the license, you can either:

  • Move on to the next section to start the Semantic Search, then follow the instructions in the Semantic Objects documentation on how to Set up a license from the Workbench using http://localhost:9993/, or
  • Update the Semantic Objects volumes configuration in the Docker Compose file following the instructions in the Semantic Objects documentation for Setting up a license through a file.

cURL

cURL is required only if you intend to use the system console to create repositories and load data into GraphDB, manage a Semantic Objects schema, or perform GraphQL queries. All of these actions can also be executed from the Workbench.

Please follow the cURL installation guide.

Start the Service

Start the docker containers using:

docker-compose -f /path/to/your/docker-compose.yaml up -d

If you have problems with old containers, consider using the --force-recreate flag, e.g., docker-compose -f /path/to/your/docker-compose.yaml up -d --force-recreate.

You can check the running containers using the following docker command:

docker ps

It should include Semantic Search, Semantic Objects, Workbench, GraphDB, Elasticsearch, and Kibana.

Initialize GraphDB

  1. If your GraphDB distribution is an Enterprise edition, as in the example above, you will need to provide a license. You can do this through the GraphDB Workbench using http://localhost:9998/. See the official documentation on Setting up GraphDB Licenses.

    Hint

    Alternatively, the license can be provisioned by mounting it in the Docker container in the /opt/graphdb/dist/conf/graphdb.license path.
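    In the Docker Compose file, such a mount could look like the following sketch. The service name graphdb and the host-side path ./graphdb.license are assumptions that depend on your compose file; the container path is the one GraphDB reads the license from:

    ```yaml
    services:
      graphdb:
        volumes:
          # Mount a local license file at the path GraphDB expects it
          - ./graphdb.license:/opt/graphdb/dist/conf/graphdb.license:ro
    ```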

  2. Once the license is provisioned, you need to create a repository. First, download the repo.ttl repository configuration file, which defines a repository named soaas.

  3. Upload it via the GraphDB Workbench following the instructions for Creating a repository.

    Alternatively, you can also upload it using the following cURL command:

    curl -X POST -H "Content-Type: multipart/form-data" -F "config=@repo.ttl" http://localhost:9998/rest/repositories/
    

Hint

A repository can be initialized automatically by GraphDB if repo.ttl is mounted in the GraphDB Docker container at the /opt/graphdb/dist/data/repositories/soaas/config.ttl path.
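In the Docker Compose file, this automatic provisioning could look like the following sketch. The service name graphdb and the host-side path ./repo.ttl are assumptions that depend on your compose file:

```yaml
services:
  graphdb:
    volumes:
      # repo.ttl becomes the configuration of a repository named soaas,
      # which GraphDB initializes automatically on startup
      - ./repo.ttl:/opt/graphdb/dist/data/repositories/soaas/config.ttl
```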

Put Star Wars Data into GraphDB

  1. Download the starwars-data.ttl RDF dataset.

    It describes Star Wars films, starships, characters, and more. You can find more details about the dataset here.

  2. Upload it via the GraphDB Workbench following the instructions for Loading data from a local file.

    Alternatively, you can also upload it using the following cURL command:

    curl -X POST -H "Content-Type:application/x-turtle" -T starwars-data.ttl http://localhost:9998/repositories/soaas/statements
    

Define Star Wars Semantic Objects schema

  1. Download the Semantic Objects schema schema.yaml.

    It describes how the Semantic Objects map to the Star Wars RDF data and how these Semantic Objects should be indexed in Elasticsearch. It is also used to generate a GraphQL schema for querying the Star Wars data indexed in Elasticsearch.

  2. Load the Semantic Objects schema from the Workbench on http://localhost:9993/ following the instructions for Uploading Schema Wizard.

    Alternatively, you can also load it using the following cURL command:

    curl -X POST -H "Content-Type: text/yaml" -H "Accept: application/ld+json" -T schema.yaml -H "X-Request-ID: GettingStartedTx01" http://localhost:9995/soml
    
  3. Activate (bind) this schema to Semantic Objects and to Semantic Search in order to start indexing to Elasticsearch and generate a GraphQL schema. You can do this from the Workbench by following the Upload Schema Wizard steps or from the Manage Schema page.

    Alternatively, you can also activate it using the following cURL commands:

    curl -X PUT -H "X-Request-ID: GettingStartedTx02" http://localhost:9995/soml/swapi/soaas
    curl -X PUT -H "X-Request-ID: GettingStartedTx02" http://localhost:9980/soml/swapi/search
    

Run a Star Wars GraphQL Query

  • GraphQL Query to Semantic Search

    As you can see in the Semantic Objects schema that you downloaded and activated in the Semantic services, there is a configuration for the Wookiee Semantic Object that creates the otp-wookiee index in Elasticsearch and indexes into it all the Wookiee objects loaded into GraphDB:

    Wookiee:
      search: {index: yes}
    

    For more info how to configure the Semantic Search using the Semantic Objects schema, see Semantic Object Modeling.

    The following Semantic Search GraphQL query retrieves all Wookiees; the data is extracted from Elasticsearch:

    query getWookiees { wookiee_search { max_score hits { score wookiee { id name rdfs_label {value lang} } } } }

    You can execute the query by accessing the Workbench GraphiQL Playground on http://localhost:9993/graphql and selecting Semantic Search service endpoint from the dropdown.

    An equivalent cURL request looks like this:

    curl --location --request POST 'http://localhost:9980/graphql' \
        --header 'accept: application/json' \
        --header 'Content-Type: application/json' \
        --data-raw '{"query":"query getWookiees { wookiee_search { max_score hits { score wookiee { id name rdfs_label {value lang}}}}}","variables":{}}'
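
    If you script such requests, hand-escaping the query inside the JSON body is error-prone. A minimal sketch that builds the body programmatically, assuming python3 is available on the machine (the shortened query here omits the rdfs_label field for brevity):

    ```shell
    # Build the GraphQL request body; python3 JSON-encodes the query string,
    # so quotes inside it need no manual escaping
    QUERY='query getWookiees { wookiee_search { max_score hits { score wookiee { id name } } } }'
    BODY=$(python3 -c 'import json,sys; print(json.dumps({"query": sys.argv[1], "variables": {}}))' "$QUERY")
    echo "$BODY"
    # Then send it to the local Semantic Search endpoint, e.g.:
    #   curl -s -H 'Content-Type: application/json' -d "$BODY" http://localhost:9980/graphql
    ```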
    

    For more complex GraphQL queries showing the wide variety of available functionalities, see the Semantic Search GraphQL Query tutorial.

  • GraphQL Query to Semantic Objects

    You can execute a similar GraphQL query against the Semantic Objects to again retrieve all the Wookiees, but this time the data is extracted from GraphDB rather than from the Elasticsearch indexes.

    You can execute the query by accessing the Workbench GraphiQL Playground on http://localhost:9993/graphql and selecting Semantic Objects service endpoint from the dropdown.

    query Wookiees { wookiee { id name } }

    An equivalent cURL request to the Semantic Objects looks like this:

    curl --location --request POST 'http://localhost:9995/graphql' \
        --header 'accept: application/json' \
        --header 'Content-Type: application/json' \
        --data-raw '{"query":"query Wookiees { wookiee { id name }}","variables":{}}'
    

Stop the Service

Stop and remove all Platform Docker containers using:

docker-compose -f /path/to/your/docker-compose.yaml down

To remove the volume data as well, use:

docker-compose -f /path/to/your/docker-compose.yaml down --volumes