PatentInspector is an open-source tool that analyzes patent data. Its backend is implemented in Python using the Django framework, and its frontend is based on Vue. The tool is designed to be user-friendly and easily extensible, making it accessible to a wide range of users.
You can use PatentInspector at the following URL (please note that our resources are limited): https://patentinspector.csd.auth.gr/
First, create an account and log in; then you can use the tool.
- Easily generate reports by filtering patents with comprehensive filters.
- Your reports are securely saved and accessible whenever you need them, so you can track patent trends effortlessly.
- Quickly grasp the patent landscape with statistical metrics such as the average number of claims.
- Dive into how variables change over time with intuitive time series plots, helping you identify trends and patterns.
- Explore the patent landscape from different angles with interactive plots, including bar charts and heat maps, to uncover hidden insights.
- Gain an understanding of the main topics within the patent landscape and their significance through visually engaging bar charts and scatter plots in the thematic analysis section.
- Visualize the citation network of the patents in your reports and discover the most cited patents.
- Each chart in PatentInspector comes with a corresponding table that can be copied with a single click and pasted into your spreadsheet of choice.
- PatentInspector lets you export filtered patents in Excel format, giving you the freedom to conduct further in-depth analyses on your own terms.
It should take about 1-2 hours to download the dump and insert it into the database. The dump is ~4GB and the database is ~40GB.
Docker will create dummy certificates for HTTPS. If you want to use your own, place the SSL key as `server.key` and the SSL certificate as `server.crt` in the `index/project` directory.
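If you want to supply your own self-signed pair rather than relying on the generated one, OpenSSL can produce it. This is just a sketch; the subject (`/CN=localhost`) is a placeholder you should replace with your own domain:

```sh
# Generate a self-signed key/certificate pair valid for one year,
# using the file names the setup expects ("/CN=localhost" is a placeholder).
openssl req -x509 -newkey rsa:4096 -nodes \
  -keyout server.key -out server.crt \
  -days 365 -subj "/CN=localhost"
```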
Install Docker if you haven't already. Then run the following commands:
cp ./backend/.env.example ./backend/.env # defaults are ok for personal use
sudo docker compose up
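For a long-running deployment you may prefer to start the stack in the background; the standard Docker Compose flags for that would be:

```sh
# Start the containers detached, then follow their logs.
sudo docker compose up -d
sudo docker compose logs -f
```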
- Install `python`, `pip` and `virtualenv`.
- Install `postgresql` and create an admin user.
- Create a `.env` file based on the `.env.example` file.
- Create a database named whatever `POSTGRES_DB` is set to in your `.env` file and install the `postgis` and `pg_trgm` extensions to it (see the sketch after this list). Most likely postgis has system dependencies that you need to install.
- Create a virtualenv and activate it.
- Install the `requirements.txt` file using pip.
- Run the `uspto` and `index` management commands to download data from the USPTO website, pre-process it, insert it into the database, and index it (about 10 hours), or run the `load_database` management command to load the indexed database from a file stored on a remote server (about 1 hour).
- Run the server with `python manage.py runserver`.
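As a concrete sketch of the database and virtualenv steps above: assuming a local PostgreSQL install with a `postgres` superuser, and `POSTGRES_DB=patentinspector` in your `.env` (the database name here is illustrative), the commands could look like this:

```sh
# Create the database; the name must match POSTGRES_DB in your .env
# ("patentinspector" is just a placeholder).
sudo -u postgres createdb patentinspector

# Install the required extensions into it (postgis usually needs
# system packages first, e.g. your distribution's postgis package).
sudo -u postgres psql -d patentinspector \
  -c "CREATE EXTENSION postgis;" \
  -c "CREATE EXTENSION pg_trgm;"

# Create and activate a virtualenv, then install the Python dependencies.
virtualenv venv
source venv/bin/activate
pip install -r requirements.txt
```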
- Install `npm`.
- Run `npm install`.
- Run `npm run dev`.
- Visit the URL that is printed in the console (most likely http://localhost:5173/).
Remember, if you get stuck you can always take a look at the Docker files to see how we run the project in production. You can even open an issue if you want to ask something or suggest a change.
The `uspto` management command downloads data from the USPTO website and inserts it into the database. It can be used as follows:
python manage.py uspto
The `index` management command indexes the database. It can be used as follows:
python manage.py index
The `dump_database` command dumps the database into a file and uploads it to Google Drive (if credentials are provided). To provide credentials, you need to create a service account key in Google Cloud and put the downloaded JSON in the `backend` directory with the name `service-secrets.json`. Using the command is as simple as:
python manage.py dump_database
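If you haven't created a service account key before, the `gcloud` CLI can do it. This is only a sketch: the account name `patentinspector-backup` and the project ID `my-project` are placeholders, and depending on how the upload is set up you may also need to share the target Drive folder with the service account:

```sh
# Create a service account (the name is a placeholder).
gcloud iam service-accounts create patentinspector-backup --project my-project

# Download a JSON key for it, saved under the name the command expects.
gcloud iam service-accounts keys create backend/service-secrets.json \
  --iam-account patentinspector-backup@my-project.iam.gserviceaccount.com
```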
The `load_database` command loads the database from a file stored on a remote server. Download links are configured in the `.env` file: the `DOWNLOAD_BACKUP_URL1` variable specifies the URL for a Google Drive file, and the `DOWNLOAD_BACKUP_URL2` variable specifies a backup URL, which can be anything else. The `.env.example` file contains valid URLs to download a database dump from Google Drive or Azure Blob Storage. To use the command, just run:
python manage.py load_database
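For illustration, the relevant `.env` entries have the following shape; these values are placeholders, not the real URLs from `.env.example`:

```sh
# Primary link: a Google Drive file (placeholder ID).
DOWNLOAD_BACKUP_URL1=https://drive.google.com/uc?id=FILE_ID_PLACEHOLDER
# Fallback link: any direct-download URL, e.g. Azure Blob Storage (placeholder).
DOWNLOAD_BACKUP_URL2=https://example.blob.core.windows.net/backups/dump.zip
```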