I have a flat file in which each row is a record such as:
AX101 12345 PQR 101
AX102 18390 XYZ 091
AX101 81249 PQR 871
My setup has a few machines networked together running Hyperledger, set up with Vagrant and Docker.
For this test, I am running just one validating peer, which should read the flat file (over a million such rows) and put each row on the chain as a new block. The intention is to measure how fast this can be done. What would be the best way to achieve it?
Approach 1: The loading code could be written in Golang inside a smart contract (chaincode).
Approach 2: A separate "reader" in another language that sends the data to the validating peer via its APIs (which I suspect would be slower).
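Whichever approach wins, the per-row work is the same: split each line into its four fields and submit it. A minimal sketch of that parsing step in plain Go (no chaincode dependencies; the Record type and field names are my assumptions based on the sample rows above, and the commented-out submission step is where a PutState-style call would go):

```go
package main

import (
	"bufio"
	"fmt"
	"strings"
)

// Record mirrors the four whitespace-separated fields in the sample rows.
// Field names are guesses; adjust to the real schema.
type Record struct {
	ID    string // e.g. "AX101"
	Num   string // e.g. "12345"
	Code  string // e.g. "PQR"
	Extra string // e.g. "101"
}

// parseRow splits one flat-file line into a Record.
// It reports false if the line does not have exactly four fields.
func parseRow(line string) (Record, bool) {
	f := strings.Fields(line)
	if len(f) != 4 {
		return Record{}, false
	}
	return Record{ID: f[0], Num: f[1], Code: f[2], Extra: f[3]}, true
}

func main() {
	data := `AX101 12345 PQR 101
AX102 18390 XYZ 091
AX101 81249 PQR 871`

	sc := bufio.NewScanner(strings.NewReader(data))
	for sc.Scan() {
		if rec, ok := parseRow(sc.Text()); ok {
			// In chaincode, this is where each record would be
			// submitted as a transaction on the peer.
			fmt.Printf("%s -> %+v\n", rec.ID, rec)
		}
	}
}
```

For the real file, the string reader would be replaced by an os.Open of the flat file; bufio.Scanner streams it line by line, so the million rows never need to be in memory at once.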
Once (hopefully) all the data is on the blockchain, I need to retrieve all entries for, say, AX101. Speed is not a concern here, but picking up every matching entry is.
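The retrieval step is just a filter on the first field once the records are back out of the ledger. A plain-Go sketch of that filter (the in-memory slice here is a stand-in for whatever the ledger query returns; in real chaincode, keying records with an "AX101"-style prefix and doing a range query over that prefix would push this filtering to the peer):

```go
package main

import "fmt"

// entriesFor returns every record whose first field matches id.
// records stands in for the result of a ledger query.
func entriesFor(id string, records [][4]string) [][4]string {
	var out [][4]string
	for _, r := range records {
		if r[0] == id {
			out = append(out, r)
		}
	}
	return out
}

func main() {
	records := [][4]string{
		{"AX101", "12345", "PQR", "101"},
		{"AX102", "18390", "XYZ", "091"},
		{"AX101", "81249", "PQR", "871"},
	}
	// Collect every AX101 entry, preserving file order.
	for _, r := range entriesFor("AX101", records) {
		fmt.Println(r)
	}
}
```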
Any pointers would be helpful!