Does anyone else have this issue where Google Colab just crashes when I try to graph a large chunk of data? More specifically, the VM runs out of RAM as soon as I start plotting. Is there any way to avoid this?
The problem is with the .todense() call. If you follow the answer on StackOverflow and call .todense() on a large sparse matrix, it materializes the entire matrix as a dense array in memory, so the VM's RAM runs out and the session crashes.
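A minimal sketch of the workaround, assuming your data is a SciPy sparse matrix like in the StackOverflow answer (the matrix shape, density, and sample size here are made up for illustration): instead of densifying everything with .todense(), densify only the rows you actually want to plot.

```python
import numpy as np
from scipy import sparse

rng = np.random.default_rng(0)
# Hypothetical stand-in for a large dataset: 100k x 1k, 0.1% nonzero.
X = sparse.random(100_000, 1_000, density=0.001, format="csr", random_state=0)

# Densifying the whole thing needs 100_000 * 1_000 * 8 bytes = 800 MB,
# which is what exhausts the Colab VM at real-world scales.
dense_bytes = X.shape[0] * X.shape[1] * 8
sparse_bytes = X.data.nbytes + X.indptr.nbytes + X.indices.nbytes

# Instead of X.todense(), pull a random subset of rows and densify
# only that small block before handing it to your plotting code.
sample = rng.choice(X.shape[0], size=1_000, replace=False)
chunk = X[sample].toarray()  # ~8 MB dense block instead of 800 MB

print(chunk.shape)
```

For a scatter plot or histogram a random sample usually looks the same as the full dataset anyway, and the dense block stays small enough for the VM.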
Thanks! Saved me before I even ran into the problem!
Thanks! Had the same issue