Import & Export
TablePro supports importing data from CSV, JSON, and SQL files, as well as exporting your data in multiple formats (CSV, JSON, SQL, and Excel XLSX) for backup, migration, or analysis.
Export Data
Export Options
Export data from query results or tables:
- Run a query or open a table
- Click the Export button in the toolbar
- Choose your export format
- Select destination and options
- Click Export
CSV (Comma-Separated Values)
Best for:
- Excel and spreadsheet applications
- Data analysis tools
- Simple data exchange
Options:

| Option | Description | Default |
|---|---|---|
| Delimiter | Character separating values | , (comma) |
| Quote | Character for quoting strings | " |
| Header | Include column names | Yes |
| NULL as | How to represent NULL | Empty string |
| Formula sanitization | Prevent CSV injection | Yes |
| Decimal format | Period or comma for decimals | Period |
| Line break format | LF, CRLF, or CR | LF |
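
Example output (default options; the same sample rows as the JSON example below):

```csv
id,name,email,created_at
1,John Doe,john@example.com,2024-01-15T10:30:00Z
2,Jane Smith,jane@example.com,2024-01-16T14:45:00Z
```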
JSON (JavaScript Object Notation)
Best for:
- Web applications
- APIs
- NoSQL databases
- Programmatic processing
Options:

| Option | Description | Default |
|---|---|---|
| Pretty print | Format with indentation | Yes |
| Array format | Export as array of objects | Yes |
| Preserve as strings | Keep all values as strings (useful for ZIP codes, phone numbers) | No |
Example output:

```json
[
  {
    "id": 1,
    "name": "John Doe",
    "email": "john@example.com",
    "created_at": "2024-01-15T10:30:00Z"
  },
  {
    "id": 2,
    "name": "Jane Smith",
    "email": "jane@example.com",
    "created_at": "2024-01-16T14:45:00Z"
  }
]
```
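
For comparison, with Preserve as strings enabled the same first row exports with every value as a JSON string, so leading zeros in ZIP codes or phone numbers survive:

```json
[
  {"id": "1", "name": "John Doe", "email": "john@example.com", "created_at": "2024-01-15T10:30:00Z"}
]
```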
SQL INSERT Statements
Best for:
- Database migration
- Backup and restore
- Seeding databases
Options:

| Option | Description | Default |
|---|---|---|
| Table name | Target table for INSERT | Original table |
| Include CREATE | Include CREATE TABLE | No |
| Batch size | INSERTs per statement | 500 |
Example output:

```sql
INSERT INTO users (id, name, email, created_at) VALUES
(1, 'John Doe', 'john@example.com', '2024-01-15 10:30:00'),
(2, 'Jane Smith', 'jane@example.com', '2024-01-16 14:45:00');
```
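
With Include CREATE enabled, the dump is prefixed with the table definition. The column types below are illustrative; the actual statement is generated from the source table:

```sql
CREATE TABLE users (
  id INT PRIMARY KEY,
  name VARCHAR(255),
  email VARCHAR(255),
  created_at DATETIME
);
```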
Excel Spreadsheet (XLSX)
Best for:
- Sharing data with non-technical users
- Excel and spreadsheet applications
- Business reports and analysis
Options:

| Option | Description | Default |
|---|---|---|
| Include headers | Column names in first row (bold) | Yes |
| NULL as empty | Convert NULL values to empty cells | Yes |
Features:
- Multi-sheet export: When exporting multiple tables, each table becomes a separate worksheet
- Numeric detection: Numbers are stored as numeric cells (not text) for proper Excel formatting
- Shared strings: String deduplication for efficient file size
- Bold headers: First row is automatically bolded when headers are included
- Large dataset support: Optimized for 175K+ rows using streaming Data buffers and background file assembly
- No dependencies: Uses a pure-Swift OOXML writer — no external libraries required
Exporting Query Results
To export the results of a specific query:
- Write and execute your query
- Click Export while results are displayed
- Only the displayed results are exported
For large exports, use pagination or LIMIT in your query to control the size of the export.
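
For example, export in fixed-size chunks:

```sql
-- First batch; order by a stable key (an id column is assumed here)
-- and increase OFFSET by 50000 for each subsequent export
SELECT * FROM orders ORDER BY id LIMIT 50000 OFFSET 0;
```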
Exporting Entire Tables
To export all data from a table:
- Click on the table in the sidebar
- Click Export without applying any filters
- Or run SELECT * FROM table_name and export
You can export multiple tables at once by selecting them in the sidebar before clicking Export. Each table can have individual options for structure (DDL) and data.
Exporting Structure Only
To export just the table structure (DDL):
- Open the table’s Structure tab
- Go to the DDL sub-tab
- Copy the CREATE TABLE statement
Or use database-specific commands:

```sql
-- MySQL
SHOW CREATE TABLE users;
```

```sh
# PostgreSQL: dump schema only (-s) for one table (-t)
pg_dump -s -t users database_name
```
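
MySQL can likewise dump structure only from the command line; the --no-data flag skips row data:

```sh
mysqldump --no-data database_name users > users_schema.sql
```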
Clipboard Paste (CSV/TSV)
You can paste tabular data directly from your clipboard into the data grid. TablePro automatically detects whether the clipboard content is CSV (comma-separated) or TSV (tab-separated) format.
How it works:
- Copy data from a spreadsheet, text editor, or other source
- Select a row in the data grid
- Press Cmd+V to paste
TablePro uses an RFC 4180-compliant CSV parser that handles:
- Quoted fields with commas inside (e.g., "New York, NY")
- Escaped double quotes (e.g., "He said ""hello""")
- Multi-line values within quoted fields
- Automatic header row detection (skipped if the column names match)
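
For example, the following clipboard content parses as a header row plus one data row of three fields:

```csv
name,location,quote
John Doe,"New York, NY","He said ""hello"""
```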
The format is auto-detected based on content: if the data contains more tabs than commas, it’s parsed as TSV; otherwise as CSV. Data copied from Excel or Google Sheets is typically tab-separated.
Import Data
Import Options
Import data into your database:
- Click File > Import (or the Import button)
- Select your source file
- Configure import options
- Preview the data
- Click Import
Importing CSV Files
Configuration options:

| Option | Description |
|---|---|
| Delimiter | Field separator (comma, tab, semicolon) |
| Quote character | Character used for quoting |
| Escape character | Character for escaping quotes |
| Has header | First row contains column names |
| Skip rows | Number of rows to skip at start |
| NULL representation | String to interpret as NULL |
| Encoding | File encoding (UTF-8, Latin-1, etc.) |
Ensure your CSV encoding matches your database encoding to avoid character issues.
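
To check the server-side encoding in MySQL, for example:

```sql
SHOW VARIABLES LIKE 'character_set%';
```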
Importing JSON Files
Supported formats:
- Array of objects (most common)
- Newline-delimited JSON (NDJSON)
Example supported input:

```json
[
  {"name": "Product A", "price": 29.99},
  {"name": "Product B", "price": 49.99}
]
```
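
The same data as NDJSON (one object per line, no surrounding array):

```json
{"name": "Product A", "price": 29.99}
{"name": "Product B", "price": 49.99}
```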
Configuration options:

| Option | Description |
|---|---|
| Root path | JSON path to array (if nested) |
| Flatten nested | Expand nested objects |
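
As an illustration, given a nested payload like the one below, a root path pointing at data.items (the exact path syntax here is an assumption, not confirmed) selects the inner array for import:

```json
{"data": {"items": [{"name": "Product A", "price": 29.99}]}}
```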
Importing SQL Files
Execute SQL files containing:
- INSERT statements
- CREATE TABLE + INSERT
- Full database dumps
Options:

| Option | Description |
|---|---|
| Stop on error | Halt import on first error |
| Transaction | Wrap in transaction |
| Encoding | File encoding |
Large SQL files are processed in batches for memory efficiency.
Import Process
- Select File: Choose your CSV, JSON, or SQL file
- Configure Options: Set delimiter, encoding, and other options
- Map Columns: Match file columns to table columns
- Preview: Review the first few rows before importing
- Import: Execute the import and monitor progress
Column Mapping
When importing, you can map source columns to destination columns:
- Auto-match: Columns with matching names are automatically mapped
- Manual mapping: Drag columns or use dropdowns to map
- Skip columns: Ignore columns you don’t want to import
- Default values: Set defaults for unmapped required columns
Import Modes
| Mode | Description | Use When |
|---|---|---|
| Insert | Add new rows | Importing fresh data |
| Replace | Delete matching, insert new | Full table refresh |
| Update | Update matching, insert new | Incremental updates |
Replace mode deletes existing data. Always back up important tables first.
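
Update mode behaves like an upsert. As a rough sketch of the equivalent SQL per imported row (MySQL syntax for illustration; this is not necessarily how TablePro implements it):

```sql
-- Insert the row, or update the existing one when a unique key collides
INSERT INTO users (id, name, email)
VALUES (1, 'John Doe', 'john@example.com')
ON DUPLICATE KEY UPDATE name = VALUES(name), email = VALUES(email);
```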
Data Preview
Preview Before Import
The preview feature shows:
- First 100 rows of data
- Detected column types
- Potential parsing issues
- NULL value handling
Preview Before Export
See what will be exported:
- Total row count
- Estimated file size
- Data sample
Progress and Errors
Import Progress
Large imports show progress:
- Rows processed / total rows
- Estimated time remaining
- Current batch
Error Handling
When errors occur during import:
| Option | Behavior |
|---|---|
| Stop on first error | Halt import immediately |
| Skip errors | Continue, log failed rows |
| Rollback on error | Revert all changes if any error |
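
Rollback on error is conceptually the same as wrapping the whole import in a single transaction:

```sql
BEGIN;
-- ... all INSERT statements from the import ...
COMMIT;      -- runs only if every statement succeeded
-- on any error: ROLLBACK;
```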
Error details include:
- Row number
- Error message
- Problematic data
Error Log
Failed rows are logged for review:
```text
Row 45: Data too long for column 'name' at row 1
Row 89: Duplicate entry 'john@example.com' for key 'email'
Row 123: Column count doesn't match value count
```
Compression
TablePro supports gzip compression for both import and export:
Compressed Export
- Enable the “Compress with gzip” option when exporting
- Output file gets a .gz extension
- Significantly reduces file size for large exports
Compressed Import
- TablePro automatically detects .gz files
- Decompresses on-the-fly during import (FileDecompressor.swift)
- No manual decompression needed
Best Practices
Exporting
- Filter first: Export only what you need:

  ```sql
  SELECT * FROM orders WHERE status = 'completed' AND YEAR(created_at) = 2024;
  ```
- Consider file size: Use LIMIT for very large tables:

  ```sql
  SELECT * FROM logs ORDER BY created_at DESC LIMIT 100000;
  ```
- Choose the appropriate format:
  - CSV for data analysis tools
  - JSON for APIs and web applications
  - SQL for database migration
  - XLSX for sharing with non-technical users in Excel
Importing
- Backup first: Always back up before bulk imports
- Test with sample: Import a few rows first:

  ```sql
  -- Create a test table with the same structure
  CREATE TABLE users_import_test LIKE users;
  -- Import a sample into the test table before touching the real one
  ```
- Validate data types: Ensure source data matches target columns
- Use transactions: For critical imports, enable transaction mode
- Disable indexes: For very large imports, consider:

  ```sql
  -- Disable keys (MySQL)
  ALTER TABLE users DISABLE KEYS;
  -- Import data, then re-enable to rebuild indexes
  ALTER TABLE users ENABLE KEYS;
  ```
Troubleshooting
Export Issues
“File too large”
- Use LIMIT in your query
- Export in batches
- Choose a more compact format (e.g., CSV instead of JSON)
“Character encoding issues”
- Verify database encoding
- Export with UTF-8 encoding
- Check for special characters
Import Issues
“Column count mismatch”
- Check delimiter settings
- Verify CSV has correct number of columns
- Look for unquoted commas in data
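
For example, the unquoted comma below makes the data row parse as four fields instead of three; quoting the field as "New York, NY" fixes it:

```csv
id,name,address
1,John Doe,New York, NY
```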
“Invalid value for type”
- Check date formats match database expectations
- Verify numeric values don’t contain commas
- Check for text in numeric columns
“Duplicate key error”
- Data contains duplicates for unique columns
- Consider using REPLACE mode
- Remove duplicates before import
“Foreign key constraint”
- Import parent tables first
- Temporarily disable foreign key checks:

  ```sql
  SET FOREIGN_KEY_CHECKS=0;
  -- Import
  SET FOREIGN_KEY_CHECKS=1;
  ```
Keyboard Shortcuts
| Action | Shortcut |
|---|---|
| Export | Cmd+Shift+E |
| Import | Cmd+Shift+I |
Next Steps