Coefficient's PostgreSQL integration lets you easily connect your PostgreSQL database server and import your data into Google Sheets/Excel. Users can import their data by selecting Tables & Columns or by writing their own SQL query. GPT Copilot can also be leveraged if users need help creating a custom query for their Coefficient imports! 😎
Connecting to PostgreSQL
Import from PostgreSQL
Schedule your Import, Snapshots, and Add Automations
FAQs for PostgreSQL Integration
Connecting to PostgreSQL
When you begin a PostgreSQL import for the first time, you will need to go through a few steps to connect PostgreSQL as a data source for Coefficient.
ℹ️ NOTE: Coefficient will need the following information: Host, Database Name, Username, and Password. (The default PostgreSQL port is 5432)
1. Open the Coefficient Sidebar and click the Menu.
2. Select “Connected Sources”.
3. Select “Add Connection” at the bottom and then “Connect” to PostgreSQL.
4. Enter the required fields (Host, Database name, Username, Password, and Port).
5. If your database is behind a firewall, you will need to whitelist all three of Coefficient's server IP addresses (34.217.184.131, 44.234.233.60, 52.32.132.51). Click "Connect" when done.
6. You will then be presented with the option to share this connection with other members of your team who also use Coefficient. Your credentials will NOT be shared with your team. 🎉
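Once connected, you can confirm that everything works by running a quick test through a Custom SQL Query import (covered below). This is a minimal sketch of a harmless read-only query; it simply reports which database and user the connection uses and the server version.

```sql
-- Simple connectivity check: which database and user is the connection using,
-- and which PostgreSQL version is the server running?
SELECT current_database() AS current_db,
       current_user       AS connected_as,
       version()          AS server_version;
```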
Import from PostgreSQL
There are a few ways to import data from PostgreSQL using Coefficient: importing from Tables & Columns, importing with a Custom SQL Query, and importing with the GPT SQL Builder.
Importing from Tables & Columns allows you to create imports without having to write SQL. Using a Custom SQL Query gives you additional flexibility in the data that you are importing into Coefficient. And lastly, you can now prompt Coefficient's AI to automatically build the SQL query for you. 🤯
Import from Tables & Columns
1. From the Sidebar select “Import from…”.
2. Select “PostgreSQL” from the list.
3. Choose "From Tables & Columns".
4. The Import Preview window opens showing all the table schemas from your PostgreSQL database. Select the table for your import (e.g., “coeff.actor”).
5. Once the table is selected, the fields within that table will appear in a list on the left side of the Import Preview window. Select the fields you want to include in your import by checking/unchecking the corresponding boxes.
ℹ️ NOTE: The Import Preview shows only a sample of your data (50 rows). This sample data will be updated if there are any changes to the import's criteria.
6. Customize your import by adding filters, sorts, limits, or even grouping the data into a cloud pivot table, then click "Import" when done. (For a rough SQL equivalent of these options, see the sketch after these steps.)
7. Congratulations on your first PostgreSQL import using Tables & Columns! 🎉
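For readers who think in SQL, the filter, sort, limit, and grouping options above correspond roughly to familiar SQL clauses. The sketch below is only an illustration using the “coeff.actor” table from the example; the column names are assumptions, and Coefficient builds the actual query for you.

```sql
-- Roughly what a filtered, sorted, limited, grouped import corresponds to in SQL.
SELECT first_name,                 -- a selected (checked) field
       COUNT(*) AS actor_count     -- grouping/aggregation (cloud pivot table)
FROM coeff.actor
WHERE last_update >= '2022-01-01'  -- filter
GROUP BY first_name
ORDER BY actor_count DESC          -- sort
LIMIT 100;                         -- limit
```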
Import from a Custom SQL Query
1. From the Sidebar select “Import from…”.
2. Select “PostgreSQL” from the list.
3. Select "Custom SQL Query".
4. The Import Preview window opens, allowing you to enter your custom SQL query in the text box. For further flexibility, you can use Coefficient’s SQL Parameters feature to point a value in your query to a specific cell or range of cells. (An example query is sketched after these steps.)
ℹ️ NOTE: Whenever you make changes to your query, you need to click "Refresh Preview" to update the sample data shown in the preview window.
5. When you click “Import” you will be prompted to give your import a name. The name MUST be UNIQUE as it will also be the name of the tab in your Google Sheets/Excel when imported. (You can always change the name later if needed).
6. Congratulations on your successful PostgreSQL Custom SQL import with Coefficient! 🎉
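As an illustration, a custom query can join or aggregate data in ways the Tables & Columns flow does not cover. The sketch below assumes the “coeff.actor” table from the earlier example plus a hypothetical “coeff.film_actor” table; adjust the schema, table, and column names to match your own database.

```sql
-- Hypothetical example: count films per actor, most prolific first.
SELECT a.actor_id,
       a.first_name,
       a.last_name,
       COUNT(fa.film_id) AS film_count
FROM coeff.actor AS a
LEFT JOIN coeff.film_actor AS fa
       ON fa.actor_id = a.actor_id
GROUP BY a.actor_id, a.first_name, a.last_name
ORDER BY film_count DESC;
```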
Import from GPT SQL Builder
1. From the Sidebar select “Import from…”.
2. Select “PostgreSQL” from the list.
3. Select "GPT SQL Builder".
4. Enter your prompt/query in the "Describe what you want to query" box. (Example: "Show all the recently updated actors for 2022") When done, click "Generate SQL".
ℹ️ PRO TIP: Be specific when entering your prompts so that the AI can easily understand your requirements and provide more accurate results.
5. The SQL Builder will automatically generate and write the SQL query for you in the blue text box. (An illustrative example of the generated SQL appears after these steps.)
ℹ️ NOTE: Click "Refresh Preview" to display a sample of your data results (only 50 rows are shown) or to update the results of the preview if you make any changes to the query.
6. You will be prompted to give your import a name. Remember it MUST be UNIQUE as it will also be the name of the tab in your Google Sheets/Excel when imported. (You can always change the name later if needed).
7. Congratulations on your PostgreSQL import using Coefficient's GPT SQL Builder! 🎉
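For the example prompt above ("Show all the recently updated actors for 2022"), the generated SQL might look something like the sketch below. The actual output depends on your schema and your prompt, so treat this only as an illustration with assumed table and column names.

```sql
-- Illustrative SQL for the prompt "Show all the recently updated actors for 2022".
SELECT *
FROM coeff.actor
WHERE last_update >= '2022-01-01'
  AND last_update <  '2023-01-01'
ORDER BY last_update DESC;
```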
ℹ️ See GPT SQL Builder to learn more!
Schedule your Import, Snapshots, and Add Automations
Once you have pulled your data into your spreadsheet using Coefficient, you can set up the following:
- Scheduled Imports
- Snapshots
- Automations
FAQs for PostgreSQL Integration
I keep getting an error when I try to connect my PostgreSQL server using Coefficient. What is wrong?
There are a few things to try in this instance:
- Make sure that you have the correct PostgreSQL Hostname, Database Name, Username, Password, and Port for your PostgreSQL instance.
- If your database is behind a firewall, you will need to whitelist our IP addresses:
- 34.217.184.131
- 44.234.233.60
- 52.32.132.51
- Ensure that your PostgreSQL server is configured to accept remote connections on its port and is not just listening for connections from localhost. The server must be reachable from the internet, not only from your local PC/machine. This may require reaching out to your PostgreSQL server admin to update some of your database settings; the sketch below shows a few checks you (or your admin) can run.
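If you are unsure whether the server accepts remote connections, a few read-only checks can help. This is a minimal sketch, assuming you (or your admin) can run queries against the server with a tool such as psql; pg_hba_file_rules is available on PostgreSQL 10+ and may require elevated privileges.

```sql
-- Which addresses is the server listening on? A value of 'localhost' means
-- remote clients (including Coefficient) cannot connect.
SHOW listen_addresses;

-- Which port is the server listening on? (5432 is the PostgreSQL default.)
SHOW port;

-- PostgreSQL 10+ only: inspect the client authentication rules loaded from
-- pg_hba.conf to confirm an entry allows your external client addresses.
SELECT type, database, user_name, address, auth_method
FROM pg_hba_file_rules;
```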
How long is Coefficient connected to my PostgreSQL instance?
When Coefficient needs to run a query, we establish a connection to your database, run the query on your behalf, and terminate the connection once the query completes.
Why is the initial connection to PostgreSQL so slow?
When initially connecting to PostgreSQL, it may take a few minutes for Coefficient to fetch your database schema: table/view definitions, column names, etc. We cache this information so setting up subsequent PostgreSQL imports should feel much snappier.
I added a table (or column) in my PostgreSQL database; why is it not showing up in Coefficient?
To deliver a snappy experience when you set up imports from PostgreSQL, we cache your database schema for up to 24 hours. If you recently changed your database schema (e.g., added a table/column, renamed a table/column, etc.), and you don't see the change reflected in Coefficient, you can force a schema reload:
- Open the Coefficient sidebar in Google Sheets/Excel.
- Click on the ≣ menu in the top right of the sidebar, then click on “Connected Sources”.
- Click on your PostgreSQL connection to see its Connection Settings page.
- Click on the ︙ button near the top right, choose “Reload Schema”, and click “Reload” on the confirmation dialog.
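Separately, if you want to confirm that the new table or column actually exists in the database itself before reloading the schema in Coefficient, a quick query against the information schema can help. This is a minimal sketch; swap in your own schema, table, and column names.

```sql
-- List the tables in a schema (replace 'public' with your schema name).
SELECT table_name
FROM information_schema.tables
WHERE table_schema = 'public'
ORDER BY table_name;

-- List the columns of a specific table (replace the schema/table names as needed).
SELECT column_name, data_type
FROM information_schema.columns
WHERE table_schema = 'public'
  AND table_name = 'actor'
ORDER BY ordinal_position;
```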
My custom SQL script seems to run longer than expected, and sometimes I see a "SQL Error - canceling statement due to statement timeout" error when I refresh my import. What should I do?
The error message you're seeing indicates that the SQL query you're trying to execute is being canceled due to a statement timeout. This means that the query is taking too long to execute, and your database server is configured to cancel any query that exceeds a certain execution time threshold.
Here are some steps you can take to understand and fix this issue:
- Examine the Execution Plan: Use the EXPLAIN command to get the execution plan for your query. This will show you where the query might be inefficient, such as performing full table scans or using nested loops that could be optimized (see the PostgreSQL documentation on EXPLAIN to learn more). A sketch of these steps appears after this list.
- Optimize the Query: Look for ways to make the query more efficient. This could involve adding indexes to the columns used in the WHERE clause and the ILIKE conditions, rewriting the query to reduce complexity, or breaking it into smaller parts.
- Reduce the Dataset: If possible, limit the scope of the query. For example, if you're querying a large date range or a large number of rows, see if you can narrow the range in your WHERE clause or cap the number of returned rows with LIMIT.
- Increase the Statement Timeout (not recommended): If you have control over the database server settings, you can increase the statement timeout value. This is a temporary solution and may not be ideal if the query is inherently inefficient.
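As a rough illustration of these steps, here is a minimal sketch using the “coeff.actor” table from the examples above. The table, column, and index names are assumptions, and the right indexes and limits depend entirely on your own query and schema.

```sql
-- 1. Examine the execution plan to see where the query spends its time.
--    Note: EXPLAIN ANALYZE actually executes the query; plain EXPLAIN does not.
EXPLAIN ANALYZE
SELECT *
FROM coeff.actor
WHERE last_update >= '2022-01-01';

-- 2. If the plan shows a sequential scan on a large table, an index on the
--    filtered column may help (hypothetical index name).
CREATE INDEX idx_actor_last_update ON coeff.actor (last_update);

-- 3. Reduce the dataset: narrow the range in the WHERE clause and/or cap the
--    number of rows returned.
SELECT actor_id, first_name, last_name, last_update
FROM coeff.actor
WHERE last_update >= '2022-07-01'
ORDER BY last_update DESC
LIMIT 10000;
```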