
Import & Export

TablePro supports importing data from CSV, JSON, and SQL files, as well as exporting your data in multiple formats (CSV, JSON, SQL, and Excel XLSX) for backup, migration, or analysis.

Export Data

Export Options

Export data from query results or tables:
  1. Run a query or open a table
  2. Click the Export button in the toolbar
  3. Choose your export format
  4. Select destination and options
  5. Click Export
Export dialog

Export Formats

Comma-Separated Values

Best for:
  • Excel and spreadsheet applications
  • Data analysis tools
  • Simple data exchange
Options:
  • Delimiter: character separating values. Default: , (comma)
  • Quote: character for quoting strings. Default: "
  • Header: include column names. Default: Yes
  • NULL as: how to represent NULL. Default: empty string
  • Formula sanitization: prevent CSV injection. Default: Yes
  • Decimal format: period or comma for decimals. Default: period
  • Line break format: LF, CRLF, or CR. Default: LF
Example output:
id,name,email,created_at
1,"John Doe",john@example.com,2024-01-15 10:30:00
2,"Jane Smith",jane@example.com,2024-01-16 14:45:00
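For illustration (hypothetical data), the same rows exported with Delimiter set to ; and with Jane’s email assumed NULL (represented as an empty string, the default) would look like:
id;name;email;created_at
1;"John Doe";john@example.com;2024-01-15 10:30:00
2;"Jane Smith";;2024-01-16 14:45:00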

Exporting Query Results

To export the results of a specific query:
  1. Write and execute your query
  2. Click Export while results are displayed
  3. Only the displayed results are exported
For large exports, use pagination or LIMIT in your query to control the size of the export.
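For example, a batched export can page through a large table with LIMIT/OFFSET (a sketch; the orders table, the id column, and the batch size are placeholders):
-- Batch 1: first 50,000 rows, ordered so batches don't overlap
SELECT * FROM orders ORDER BY id LIMIT 50000 OFFSET 0;
-- Batch 2: next 50,000 rows
SELECT * FROM orders ORDER BY id LIMIT 50000 OFFSET 50000;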

Exporting Entire Tables

To export all data from a table:
  1. Click on the table in the sidebar
  2. Click Export without applying any filters
  3. Or run SELECT * FROM table_name and export
You can export multiple tables at once by selecting them in the sidebar before clicking Export. Each table can have individual options for structure (DDL) and data.

Exporting Structure Only

To export just the table structure (DDL):
  1. Open the table’s Structure tab
  2. Go to the DDL sub-tab
  3. Copy the CREATE TABLE statement
Or use database-specific commands:
-- MySQL
SHOW CREATE TABLE users;

-- PostgreSQL (run pg_dump from a shell, not a SQL console)
pg_dump -s -t users database_name

Clipboard Paste (CSV/TSV)

You can paste tabular data directly from your clipboard into the data grid. TablePro automatically detects whether the clipboard content is in CSV (comma-separated) or TSV (tab-separated) format. How it works:
  1. Copy data from a spreadsheet, text editor, or other source
  2. Select a row in the data grid
  3. Press Cmd+V to paste
TablePro uses an RFC 4180-compliant CSV parser that handles:
  • Quoted fields with commas inside (e.g., "New York, NY")
  • Escaped double quotes (e.g., "He said ""hello""")
  • Multi-line values within quoted fields
  • Automatic header row detection (skips if column names match)
The format is auto-detected based on content: if the data contains more tabs than commas, it’s parsed as TSV; otherwise as CSV. Data copied from Excel or Google Sheets is typically tab-separated.
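For instance, the following snippet is valid RFC 4180 and exercises the cases listed above: a quoted comma, escaped double quotes, and a field spanning two lines (illustrative data only):
id,city,note
1,"New York, NY","She said ""hello"""
2,Boston,"first line
second line of the same field"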

Import Data

Import Options

Import data into your database:
  1. Click File > Import (or the Import button)
  2. Select your source file
  3. Configure import options
  4. Preview the data
  5. Click Import
Import dialog

Import Formats

Importing CSV Files

Configuration options:
  • Delimiter: field separator (comma, tab, semicolon)
  • Quote character: character used for quoting
  • Escape character: character for escaping quotes
  • Has header: first row contains column names
  • Skip rows: number of rows to skip at start
  • NULL representation: string to interpret as NULL
  • Encoding: file encoding (UTF-8, Latin-1, etc.)
Ensure your CSV encoding matches your database encoding to avoid character issues.
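For example, a hypothetical semicolon-delimited file like the one below would be imported with Delimiter set to ;, Has header enabled, and NULL representation set to \N:
id;name;email
1;John Doe;\N
2;Jane Smith;jane@example.com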

Import Process

  1. Select File: Choose your CSV, JSON, or SQL file
  2. Configure Options: Set delimiter, encoding, and other options
  3. Map Columns: Match file columns to table columns
  4. Preview: Review the first few rows before importing
  5. Import: Execute the import and monitor progress

Column Mapping

When importing, you can map source columns to destination columns:
Column mapping
  • Auto-match: Columns with matching names are automatically mapped
  • Manual mapping: Drag columns or use dropdowns to map
  • Skip columns: Ignore columns you don’t want to import
  • Default values: Set defaults for unmapped required columns

Import Modes

  • Insert: add new rows. Use when importing fresh data.
  • Replace: delete matching rows, then insert new ones. Use for a full table refresh.
  • Update: update matching rows, insert new ones. Use for incremental updates.
Replace mode deletes existing data. Always back up important tables first.
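Conceptually, the three modes correspond to statements like the following (MySQL syntax, shown as a sketch; the SQL TablePro actually generates may differ by database):
-- Insert: plain inserts; duplicate keys cause errors
INSERT INTO users (id, name) VALUES (1, 'John Doe');

-- Replace: delete any existing row with the same key, then insert
REPLACE INTO users (id, name) VALUES (1, 'John Doe');

-- Update: insert new rows, update rows whose key already exists (upsert)
INSERT INTO users (id, name) VALUES (1, 'John Doe')
ON DUPLICATE KEY UPDATE name = VALUES(name);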

Data Preview

Preview Before Import

The preview feature shows:
  • First 100 rows of data
  • Detected column types
  • Potential parsing issues
  • NULL value handling
Data preview

Preview Before Export

See what will be exported:
  • Total row count
  • Estimated file size
  • Data sample

Progress and Errors

Import Progress

Large imports show progress:
  • Rows processed / total rows
  • Estimated time remaining
  • Current batch
Import progress

Error Handling

When errors occur during import:
  • Stop on first error: halt the import immediately
  • Skip errors: continue and log failed rows
  • Rollback on error: revert all changes if any error occurs
Error details include:
  • Row number
  • Error message
  • Problematic data

Error Log

Failed rows are logged for review:
Row 45: Data too long for column 'name' at row 1
Row 89: Duplicate entry 'jane@example.com' for key 'email'
Row 123: Column count doesn't match value count

Compression

TablePro supports gzip compression for both import and export:

Compressed Export

  • Enable “Compress with gzip” option when exporting
  • Output file gets .gz extension
  • Significantly reduces file size for large exports

Compressed Import

  • TablePro automatically detects .gz files
  • Decompresses on-the-fly during import (FileDecompressor.swift)
  • No manual decompression needed

Best Practices

Exporting

  1. Filter first: Export only what you need
    SELECT * FROM orders WHERE status = 'completed' AND year(created_at) = 2024;
    
  2. Consider file size: Use LIMIT for very large tables
    SELECT * FROM logs ORDER BY created_at DESC LIMIT 100000;
    
  3. Choose appropriate format:
    • CSV for data analysis tools
    • JSON for APIs/web
    • SQL for database migration
    • XLSX for sharing with non-technical users in Excel

Importing

  1. Back up first: Always back up before bulk imports
  2. Test with sample: Import a few rows first
    -- Create test table
    CREATE TABLE users_import_test LIKE users;
    -- Import sample to test table
    
  3. Validate data types: Ensure source data matches target columns
  4. Use transactions: For critical imports, enable transaction mode
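    Conceptually, transaction mode wraps the whole import like this (standard SQL; a sketch, not TablePro’s exact statements):
    START TRANSACTION;
    -- ... the import's INSERT statements run here ...
    COMMIT;   -- or ROLLBACK; to discard everything on error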
  5. Disable indexes: For very large imports, consider:
    -- Disable keys (MySQL)
    ALTER TABLE users DISABLE KEYS;
    -- Import data
    ALTER TABLE users ENABLE KEYS;
    

Troubleshooting

Export Issues

“File too large”
  • Use LIMIT in your query
  • Export in batches
  • Choose a more compact format (CSV is typically smaller than JSON)
“Character encoding issues”
  • Verify database encoding
  • Export with UTF-8 encoding
  • Check for special characters

Import Issues

“Column count mismatch”
  • Check delimiter settings
  • Verify CSV has correct number of columns
  • Look for unquoted commas in data
“Invalid value for type”
  • Check date formats match database expectations
  • Verify numeric values don’t contain commas
  • Check for text in numeric columns
“Duplicate key error”
  • Data contains duplicates for unique columns
  • Consider using REPLACE mode
  • Remove duplicates before import (see the query sketch below)
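    If you first load the file into a staging table (a common workaround; staging_users and email below are placeholder names), a query like this sketch locates the duplicate values:
    SELECT email, COUNT(*) AS occurrences
    FROM staging_users
    GROUP BY email
    HAVING COUNT(*) > 1;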
“Foreign key constraint”
  • Import parent tables first
  • Temporarily disable foreign key checks:
    SET FOREIGN_KEY_CHECKS=0;
    -- Import
    SET FOREIGN_KEY_CHECKS=1;
    

Keyboard Shortcuts

  • Export: Cmd+Shift+E
  • Import: Cmd+Shift+I

Next Steps