Random Access Memory remains a top choice, and its capacity has grown enormously over the decades. In the early days of computing, memories measured in kilobytes were normal; today the typical unit is the gigabyte, a growth factor of roughly a million. The underlying architecture, however, has not fundamentally changed: a dynamic RAM cell is still a transistor paired with a capacitor. The transistor, a semiconductor switch, either charges the capacitor or reads its value. The memory is built from arrays of these pairs, arranged so that individual bits can be accessed and changed.
Are the cells laid out in rows and columns? Yes, they are. In this respect, memory resembles displays or optical sensors: cells are organized into a grid, and data is addressed by selecting a row and a column. You could build a four-by-four array with relative simplicity, or one with billions of cells. The principle is the same, although at some scale new engineering challenges are certain to appear.
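The row-and-column addressing described above can be sketched in a few lines. This is a toy model, not a real memory interface; the names `BitGrid`, `write_bit`, and `read_bit` are illustrative assumptions.

```python
# Toy model of a DRAM-style cell grid: bits live in a 2D array
# and are addressed by (row, column), as described in the text.

class BitGrid:
    def __init__(self, rows, cols):
        self.rows, self.cols = rows, cols
        self.cells = [[0] * cols for _ in range(rows)]

    def write_bit(self, row, col, value):
        # Select a row and a column, then set that single cell.
        self.cells[row][col] = 1 if value else 0

    def read_bit(self, row, col):
        return self.cells[row][col]

grid = BitGrid(4, 4)        # a tiny 4x4 array; real chips have billions of cells
grid.write_bit(2, 3, 1)
print(grid.read_bit(2, 3))  # -> 1
```

The same two-method interface would work unchanged on a much larger grid; only the sizes passed to the constructor grow.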
A drawback of this design is that the capacitor leaks charge, so without constant refreshing the data would be lost. Refreshing causes slowdowns and extra energy consumption, which is a good reason to believe this memory might eventually be superseded by another architecture.
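The leak-and-refresh behavior can be illustrated with a small simulation. Here each cell's charge decays every tick and a read treats anything above a threshold as a 1; the decay rate, the 0.5 threshold, and the function names are all illustrative assumptions, not how real hardware is specified.

```python
# Toy model of capacitor leakage: charge fades each tick, and a
# refresh cycle reads every bit and writes it back at full charge.

DECAY = 0.7  # assumed fraction of charge surviving each tick

def decay(cells):
    return [c * DECAY for c in cells]

def read(cells):
    # Charge above 0.5 is interpreted as a stored 1.
    return [1 if c > 0.5 else 0 for c in cells]

def refresh(cells):
    # Read each bit and rewrite it at full charge.
    return [float(b) for b in read(cells)]

cells = [1.0, 0.0, 1.0, 1.0]      # stored pattern 1011
for _ in range(3):
    cells = decay(cells)          # no refresh: charge drops below threshold
print(read(cells))                # -> [0, 0, 0, 0], data lost

cells = [1.0, 0.0, 1.0, 1.0]
for _ in range(3):
    cells = refresh(decay(cells)) # refresh every tick: pattern survives
print(read(cells))                # -> [1, 0, 1, 1]
```

The cost hinted at in the text shows up here too: the refresh pass touches every cell on every tick, work (and energy) spent purely on keeping data alive rather than doing anything useful with it.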