| | | |
|---|---|---|
| author | neodarz <neodarz@neodarz.net> | 2019-01-16 22:31:43 +0100 |
| committer | neodarz <neodarz@neodarz.net> | 2019-01-16 22:31:43 +0100 |
| commit | 43b938695328978b5881fe15772fdad96a870e33 (patch) | |
| tree | ca75f2fc7311e85cd7fbdd8c5b3fa3482a4bbf4e | |
| parent | a32379718277388baa692c6a2a9e3f3b3bbcb902 (diff) | |
| download | khanindexer-43b938695328978b5881fe15772fdad96a870e33.tar.xz khanindexer-43b938695328978b5881fe15772fdad96a870e33.zip | |
Update some information on how to use this
-rw-r--r-- | README.md | 21 |
1 file changed, 16 insertions, 5 deletions
@@ -46,10 +46,10 @@ the `sphinx_search.conf` file.
 # Crawling
 
 For now there is an example spider with neodarz website.
-For testing it just run:
+To launch all the crawlers, use the following command:
 
 ```
-python app.py
+python app.py crawl
 ```
 
 # Indexing
@@ -74,6 +74,8 @@ The command for indexing is:
 indexer --config sphinx_search.conf --all
 ```
 
+Don't forget to run the crawling command before this ;)
+
 # Searching
 
 Before you can make search, you must lauch the search server
@@ -81,9 +83,18 @@ Before you can make search, you must lauch the search server
 searchd -c sphinx_search.conf
 ```
 
-Example search command:
+## Enjoy
+
+You can now launch the server!
+
+```
+python app.py
+```
+
+To start searching, send a `GET` request to the following address (without `<` and
+`>`):
 
 ```
-curl -X POST '127.0.0.1:8080/search' -d 'index=datas&match=@content livet&select=id&limit=5' --output -
+127.0.0.1:5000/?search=<search terms>
 ```
-You can also use the search function of the sphinx module of this project.
+Results are in JSON format.
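For readers who want to see the new search workflow from the last hunk in client code, here is a minimal sketch. It is not part of the commit: it assumes the server started with `python app.py` listens on `127.0.0.1:5000` over plain HTTP, takes the term in the `search` query parameter, and answers with a JSON body, as the updated README states; the exact structure of that JSON is not described in the diff, and the term `livet` is simply borrowed from the removed `curl` example.

```python
# Sketch: query the search endpoint described in the updated README.
# Assumptions: the `python app.py` server listens on 127.0.0.1:5000 over HTTP,
# accepts the term in the `search` query parameter, and returns JSON.
# The shape of the JSON payload is not specified in the diff.
import json
import urllib.parse
import urllib.request

term = "livet"  # example term, borrowed from the removed curl example
url = "http://127.0.0.1:5000/?" + urllib.parse.urlencode({"search": term})

with urllib.request.urlopen(url) as response:
    results = json.loads(response.read().decode("utf-8"))

print(results)
```

Compared with the removed `curl -X POST` call against the Sphinx search endpoint on port 8080, the interface added here is a plain GET on port 5000, so any HTTP client can be used.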