From 48d5645d13785cce7d3257c2b847995f8f98d1cf Mon Sep 17 00:00:00 2001
From: cubixle
Date: Tue, 7 May 2024 09:06:02 +0100
Subject: [PATCH] add codeblock formatting

---
 README.md | 2 ++
 1 file changed, 2 insertions(+)

diff --git a/README.md b/README.md
index 971f4c7..608f413 100644
--- a/README.md
+++ b/README.md
@@ -8,6 +8,7 @@ The dataset will be N randomly generated entires with 3 columns.
 
 Using the Hyperfine benchmarking tool, I ran the built binary for the CSV file and the Parquet file and as you can see from the results below the parquet is much faster writing 100,000 rows.
 
+```sh
 $ ./tests.sh
 Benchmark 1: ./app -type csv -amount 100000
   Time (mean ± σ):     244.2 ms ±   8.7 ms    [User: 22.7 ms, System: 197.6 ms]
@@ -16,3 +17,4 @@ Benchmark 1: ./app -type csv -amount 100000
 Benchmark 1: ./app -type parquet -amount 100000
   Time (mean ± σ):      38.3 ms ±   3.8 ms    [User: 62.0 ms, System: 9.9 ms]
   Range (min … max):    26.4 ms …  44.4 ms    69 runs
+```