Which of these ways is better:
- to perform 10,000 small queries, each with filter conditions, against a table linked to many other tables (we need data from the linked tables too, loading them via `.Include()` in the LINQ expression); or
- to fetch the data from the table and its linked tables in pieces without any conditions (in a loop of 10 iterations => 10 queries to the DB), load everything into a collection, and then do all the analytical work on each iteration with LINQ over that in-memory collection?
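The two approaches above can be sketched roughly as follows. This assumes EF Core and a hypothetical `AppDbContext` with an `Orders` set linked to `Customer` and `Items`; all entity and property names here are illustrative, not taken from your code:

```csharp
// Approach 1: one filtered query per input id (N database round-trips).
foreach (var id in ids) // e.g. 10,000 ids
{
    var order = context.Orders
        .Include(o => o.Customer)   // eager-load linked tables
        .Include(o => o.Items)
        .FirstOrDefault(o => o.Id == id);
    // ...analytical work on a single order...
}

// Approach 2: load the table in a few large pages, then analyse in memory.
var all = new List<Order>();
const int pageSize = 1000; // illustrative page size
for (int page = 0; page < 10; page++)
{
    all.AddRange(context.Orders
        .Include(o => o.Customer)
        .Include(o => o.Items)
        .OrderBy(o => o.Id)         // stable order is required for paging
        .Skip(page * pageSize)
        .Take(pageSize)
        .ToList());
}
// All subsequent filtering runs as LINQ-to-Objects, not SQL:
var matching = all.Where(o => interestingIds.Contains(o.Id)).ToList();
```

The trade-off is N round-trips versus holding the whole working set (entities plus all included navigations) in memory at once.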
I think the second way will be faster, but what about memory usage, and how should this problem be solved according to the rules of clean architecture?