Lists have no special operational treatment in Haskell. They are defined essentially like this:
data List a = Nil | Cons a (List a)
just with some special notation: [a] for List a, [] for Nil, and (:) for Cons. If you defined the same type yourself and reimplemented all the operations on it, you would get exactly the same performance.
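To make that concrete, here is a minimal sketch of such a reimplementation (the names listSum and fromTo are mine, chosen for illustration):

```haskell
-- A user-defined list type, structurally identical to the built-in one.
data List a = Nil | Cons a (List a)

-- The analogue of sum, written against the custom constructors.
listSum :: Num a => List a -> a
listSum Nil         = 0
listSum (Cons x xs) = x + listSum xs

-- The analogue of [lo..hi]: fromTo 1 3 == Cons 1 (Cons 2 (Cons 3 Nil))
fromTo :: Int -> Int -> List Int
fromTo lo hi
  | lo > hi   = Nil
  | otherwise = Cons lo (fromTo (lo + 1) hi)

main :: IO ()
main = print (listSum (fromTo 1 100))  -- prints 5050, like sum [1..100]
```

GHC compiles this to the same kind of code as the built-in list operations; the special syntax is purely notational.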
Thus, Haskell lists are singly-linked. Because of laziness, they are often used as iterators. sum [1..n] runs in constant space, because the unused prefixes of this list are garbage collected as the sum progresses, and the tails aren’t generated until they are needed.
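A small sketch of this iterator-like behavior (the constant-space claim assumes a strict sum, as in optimized GHC builds):

```haskell
main :: IO ()
main = do
  -- Only the first five elements of the infinite list are ever forced.
  print (take 5 (filter even [1 ..]))   -- [2,4,6,8,10]
  -- The list is consumed as fast as it is produced; already-summed
  -- prefixes become garbage immediately, so no million-element
  -- structure ever lives in memory at once.
  print (sum [1 .. 1000000 :: Int])     -- 500000500000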
As for #4: all values in Haskell are memoized, with the exception that functions do not keep a memo table for their arguments. So when you define fib the way you did, the results are cached and the nth Fibonacci number can be accessed in O(n) time. However, if you defined it in this apparently equivalent way:
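For reference, the usual list-based definition of this kind (presumably close to the one in your question) looks like this:

```haskell
-- The classic lazy-list Fibonacci. The list is a value, so every
-- element is cached once computed; indexing element n costs O(n).
fib :: [Integer]
fib = 1 : 1 : zipWith (+) fib (tail fib)

main :: IO ()
main = print (fib !! 10)  -- 89
```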
-- Simulate infinite lists as functions from Int
type List a = Int -> a

cons :: a -> List a -> List a
cons x xs n
  | n == 0    = x
  | otherwise = xs (n - 1)

tailF :: List a -> List a
tailF xs n = xs (n + 1)

fib :: List Integer
fib = 1 `cons` (1 `cons` (\n -> fib n + tailF fib n))
(Take a moment to note the similarity to your definition)
Then the results are not shared, and computing the nth Fibonacci number takes O(fib n) time, which is exponential in n. You can get functions to share their results with a memoization library like data-memocombinators.
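To give a rough idea of what such a library does under the hood, here is a hand-rolled sketch of one memoization strategy, backing a function with a lazily built list (data-memocombinators packages this and several other strategies; the names fibF and go are mine):

```haskell
-- Memoization via an infinite lazy list. fibs is a value, so its
-- elements are shared once computed; fibF merely indexes into it.
fibF :: Int -> Integer
fibF = (fibs !!)
  where
    fibs = map go [0 ..]   -- one cached cell per argument
    go 0 = 1
    go 1 = 1
    go n = fibF (n - 1) + fibF (n - 2)

main :: IO ()
main = print (fibF 30)  -- 1346269, computed quickly thanks to sharing
```

Indexing with (!!) is O(n) per lookup, so this is a sketch rather than an optimal scheme; data-memocombinators uses more efficient tree-shaped tables for the same effect.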