Data Science: Crunching Data with PostgreSQL and Rust

8 min read · Aug 9, 2023

This story is part of my Data Science series.

In this story I want to show you a quick way to get up and running with Rust and PostgreSQL, for either pure data analysis or machine learning modeling.

If you have Docker installed on your system, you can quickly spin up an empty database by running the following command:

docker run --name ml_db -e POSTGRES_DB=ml_db \
-e POSTGRES_USER=ml_db -e POSTGRES_PASSWORD=ml_db -d -p 5432:5432 postgres

With the database running, the next thing we need is some data. In data science/analysis, data is commonly handed over in the form of a CSV file.

As an example, let us consider the following stock data from here.

The downloaded archive contains a bundle of files with names of the form YYYY_Global_Markets_Data.csv. We can use the following Rust code to merge all the data into one common file named data.csv:

use std::{
    fs::File,
    io::{BufRead, BufReader, BufWriter, Write},
};

fn main() -> std::io::Result<()> {
    let mut data_file_writer = BufWriter::new(File::create("data/data.csv")?);
    let header = "Ticker,Date,Open,High,Low,Close,Adj Close,Volume";
    // Write the header once; the per-year files each carry their own copy.
    writeln!(data_file_writer, "{}", header)?;

    for year in 2008..2024 {
        let file_name = format!("data/{}_Global_Markets_Data.csv", year);
        let reader = BufReader::new(File::open(file_name)?);
        // Skip each file's header line and append the remaining rows.
        let mut line_iter = reader.lines().skip(1);
        while let Some(Ok(line)) = line_iter.next() {
            writeln!(data_file_writer, "{}", line)?;
        }
    }
    Ok(())
}
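For pure-Rust analysis without a database, a row of the merged file can also be parsed into a typed record directly. A minimal sketch (the StockRow struct and the hand-rolled split are illustrative; for real data with quoted fields, a crate like csv with serde would be the robust choice):

```rust
// Illustrative record for one line of data.csv; fields mirror the CSV header.
#[derive(Debug)]
struct StockRow {
    ticker: String,
    date: String,
    close: f64,
}

// Parse a single CSV line (assuming no quoted fields) into a StockRow.
fn parse_row(line: &str) -> Option<StockRow> {
    let cols: Vec<&str> = line.split(',').collect();
    if cols.len() != 8 {
        return None;
    }
    Some(StockRow {
        ticker: cols[0].to_string(),
        date: cols[1].to_string(),
        close: cols[5].parse().ok()?,
    })
}

fn main() {
    let row = parse_row("AAPL,2023-01-03,130.28,130.90,124.17,125.07,124.22,112117500")
        .expect("valid row");
    println!("{} closed at {} on {}", row.ticker, row.close, row.date);
}
```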

Next we create a table in PostgreSQL that will hold the data:

CREATE TABLE stock_prices(
ticker VARCHAR(30),
date DATE,
open REAL,
high REAL,
low REAL,
close REAL,
"Adj Close" REAL,
volume REAL
);
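With the table in place, one way to load the merged file is psql's client-side \copy command (assuming psql runs on the machine where data/data.csv lives; the HEADER option skips the header row we wrote above):

```sql
-- Client-side copy: psql reads data/data.csv and streams it to the server.
\copy stock_prices FROM 'data/data.csv' WITH (FORMAT csv, HEADER true)
```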

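Once the data is in the table, it can be queried straight from Rust. A minimal sketch using the postgres crate (an assumed dependency, e.g. postgres = "0.19" in Cargo.toml; the connection parameters mirror the docker run command above):

```rust
use postgres::{Client, NoTls};

fn main() -> Result<(), postgres::Error> {
    // Credentials match the docker run command from the beginning of the story.
    let mut client = Client::connect(
        "host=localhost user=ml_db password=ml_db dbname=ml_db",
        NoTls,
    )?;

    // Average closing price per ticker, as a first sanity check on the load.
    let rows = client.query(
        "SELECT ticker, AVG(close) FROM stock_prices GROUP BY ticker ORDER BY ticker",
        &[],
    )?;
    for row in rows {
        let ticker: &str = row.get(0);
        // AVG over a REAL column comes back as double precision, i.e. f64.
        let avg_close: f64 = row.get(1);
        println!("{}: {:.2}", ticker, avg_close);
    }
    Ok(())
}
```

From here, the result rows can be fed into whatever analysis or model-training code you like.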
I am a Software Developer - Rust, Java, Python, TypeScript, SQL - with a strong interest in research in pure and applied Mathematics.