I have a high-level understanding of what O(n) means in space. It means something along the lines of: for an algorithm with input n, the additional storage allocated in memory by the algorithm grows proportionally to n.
So if you had an algorithm that took a number n as input, created an array of size 2n, and filled it with all 0s, the time complexity would be O(n) and the space complexity would be O(n), because you are creating an array (additional storage) whose size is proportional to the input. Is this understanding correct?
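For concreteness, I mean something like this (a made-up Python sketch of the algorithm I described):

```python
def fill_zeros(n):
    # allocate an array of 2n elements filled with 0s:
    # additional storage proportional to n
    arr = [0] * (2 * n)
    return arr
```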
Secondly, how bad is a space complexity of O(n)? Popular sorting algorithms like quicksort have a worst-case space complexity of O(n), so when sorting arbitrarily large data, is it possible that the O(n) space complexity could have dire effects? And if so, is there any intuition as to why or how?
n in big-O notation usually means the size of the input, not the value passed to the algorithm.
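To illustrate the distinction, here is a rough Python sketch (the function names are made up purely for illustration):

```python
def sum_list(items):
    # n is the *size* of the input: the work grows with len(items)
    total = 0
    for x in items:
        total += x
    return total

def count_up_to(n):
    # here n is a *value*: the input is a single number,
    # yet the loop runs n times
    count = 0
    for _ in range(n):
        count += 1
    return count
```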
A space complexity of O(n) means that for each input element there may be up to some fixed number k of bytes allocated, i.e. the amount of memory needed to run the algorithm grows no faster than linearly, at roughly k*n.
For example, if a sorting algorithm allocates a temporary array of n/2 elements, the algorithm is said to have an O(n) space complexity.
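As a sketch of what that can look like (assuming a top-down merge sort that buffers only the left half during each merge, not any particular library implementation):

```python
def merge_sort(a, lo=0, hi=None):
    # Sorts a[lo:hi] in place. The merge step copies the left half
    # (at most n/2 elements) into a temporary buffer -> O(n) extra space.
    if hi is None:
        hi = len(a)
    if hi - lo <= 1:
        return
    mid = (lo + hi) // 2
    merge_sort(a, lo, mid)
    merge_sort(a, mid, hi)
    left = a[lo:mid]            # temporary array of ~n/2 elements
    i, j, k = 0, mid, lo
    while i < len(left) and j < hi:
        if left[i] <= a[j]:
            a[k] = left[i]
            i += 1
        else:
            a[k] = a[j]
            j += 1
        k += 1
    while i < len(left):        # copy any remaining left-half elements
        a[k] = left[i]
        i += 1
        k += 1
```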
It is not possible to say whether it is good or bad without some context. In many cases a space complexity of O(n) is perfectly acceptable, but there are exceptions. The intuition for when it hurts: if your input is already so large that it barely fits in memory, an extra O(n) of scratch space can push you into swapping or out of memory entirely, which is why in-place (O(1) extra space) algorithms are preferred for huge datasets. On the other hand, sometimes you increase memory complexity to reduce time complexity (i.e. pay with memory for a significant speedup), and that is almost universally considered a good tradeoff.
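As a sketch of that memory-for-speed tradeoff (an illustrative example of my own, not from the question): detecting duplicates in a list with a hash set costs O(n) extra space but runs in roughly O(n) expected time, while the O(1)-space alternative needs O(n^2) comparisons.

```python
def has_duplicates_fast(items):
    # O(n) extra space for the set, ~O(n) expected time
    seen = set()
    for x in items:
        if x in seen:
            return True
        seen.add(x)
    return False

def has_duplicates_slow(items):
    # O(1) extra space, but O(n^2) time: compare every pair
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False
```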