I would like to know how R handles memory for large data.
Please help me understand how data is handled if a 5 GB dataset is read.
Does the whole dataset reside in memory during processing? Or
are small chunks of the dataset fetched from the hard disk as and when required?
Hi, thanks for your question!
In this section of the R4DS book, there is some discussion of Big Data to consider: 1 Introduction | R for Data Science
The Memory chapter of the Advanced R book has more detail: Memory usage · Advanced R.
Also, there are packages to handle larger-than-memory data, such as arrow: Arrow R Package
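To answer your question directly: base R is in-memory, so functions like `read.csv()` load the entire dataset into RAM at once. Packages like arrow avoid that by scanning files lazily and only materializing results. Here is a minimal sketch of that pattern (assumes the `arrow` and `dplyr` packages are installed, and `data.csv` with a numeric `value` column is a hypothetical file):

```r
library(arrow)
library(dplyr)

# open_dataset() registers the file without reading it into RAM
ds <- open_dataset("data.csv", format = "csv")

# dplyr verbs build a lazy query; no data is read yet
result <- ds |>
  filter(value > 100) |>
  summarise(total = sum(value)) |>
  collect()  # only the small aggregated result lands in memory
```

The key idea is that filtering and aggregation run on chunks streamed from disk, so the 5 GB file never needs to fit in RAM at once.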
Hope some of these pointers help you.