Hi Frank,
A string itself is nothing more than a sequence of bytes (in theory). Those bytes have no particular meaning until you decide to interpret them under a particular encoding (ASCII, UTF-8, UTF-16, etc.). As such, based on my (very) limited understanding, the same sequence of bytes can very well be a valid ASCII string or a valid UTF-8 string depending on which encoding you choose to interpret it with (and since ASCII is a subset of UTF-8, any valid ASCII sequence is automatically valid UTF-8, though the reverse doesn't hold). Whether the resulting string makes any sense is another issue entirely.
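To make that concrete, here's a rough sketch in Python (the byte values are just an illustration I picked): the same bytes decode to different text, or fail outright, depending on the encoding you choose.

    # The same raw bytes, interpreted under three different encodings.
    data = b"Caf\xc3\xa9"  # four ASCII bytes plus 0xC3 0xA9

    for encoding in ("utf-8", "latin-1", "ascii"):
        try:
            print(f"{encoding:8} -> {data.decode(encoding)!r}")
        except UnicodeDecodeError as e:
            print(f"{encoding:8} -> invalid: {e}")

    # utf-8    -> 'Café'      (0xC3 0xA9 is the UTF-8 sequence for é)
    # latin-1  -> 'CafÃ©'     (also "valid", but means something different)
    # ascii    -> invalid     (0xC3 is outside the 7-bit ASCII range)

So a decode can succeed under one encoding and fail under another, and even two successful decodes of the same bytes can disagree about what the text says.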
I would love to be proven wrong, however.
Regards,
Francisco