For some reason I was thinking that, since TypedArrays represent sequences of signed and unsigned binary integers, we could use the normal operators to add, subtract, multiply, and divide them.
With bitwise operators it doesn't work:
const a = new Uint8Array([0x1, 0x1, 0x1, 0x1])
// intended as the number 4369 (binary 1000100010001)
const b = new Uint8Array([0x1, 0x1, 0x1, 0x1])
// intended as the number 4369 (binary 1000100010001)
const c = a >>> 8 // desired outcome: 10001
const d = a & b // desired outcome: 1000100010001
const e = a | b // desired outcome: 1000100010001
const f = a ^ b // desired outcome: 0111011101110
// actual results:
console.log('c:', c) // 0
console.log('d:', d) // 0
console.log('e:', e) // 0
console.log('f:', f) // 0
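As far as I can tell (this is just my guess at what's happening), the bitwise operators first coerce each operand to a number, and a typed array stringifies to a comma-separated list, so both sides end up as NaN, which the bitwise operators then treat as 0:

String(a)  // "1,1,1,1"
Number(a)  // NaN
NaN >>> 8  // 0
NaN & NaN  // 0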
Nor does it work with arithmetic operators:
const a = new Uint8Array([0x1, 0x1, 0x1, 0x1]) // intended as 4369
const b = new Uint8Array([0x1, 0x1, 0x1, 0x1]) // intended as 4369
const c = a + b
const d = a - b
const e = a * b
const f = a / b
// desired:
console.log(c === new Uint8Array([0x2, 0x2, 0x2, 0x2])) // want: true
console.log(d === new Uint8Array([0x0, 0x0, 0x0, 0x0])) // want: true
console.log(e === new Uint8Array([0x1, 0x1, 0x1, 0x1])) // want: true
console.log(f === new Uint8Array([0x1, 0x1, 0x1, 0x1])) // want: true
// but those are all false, instead:
console.log('c:', c) // 1,1,1,11,1,1,1
console.log('d:', d) // NaN
console.log('e:', e) // NaN
console.log('f:', f) // NaN
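My guess (again, just my reading of the output) is that + falls back to concatenating the stringified arrays, while -, * and / coerce both sides to NaN:

String(a) + String(b)  // "1,1,1,11,1,1,1", the value c ends up with
Number(a) * Number(b)  // NaN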
I tried using the ternary operator to work around this, but that didn't help either. I'm sure I could do some pretty inefficient conversions from typed array to number and back, but I imagine there's a better way. I've googled and browsed the docs, and nothing is popping out at me. What's the most efficient way to actually perform bitwise and arithmetic operations between TypedArrays?
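For reference, this is the kind of round trip I meant (just a sketch, with helper names I made up; it also only works because every element here happens to fit in a single hex digit):

// toNumber / fromNumber are made-up helpers: read the elements as hex
// digits into one number, and split a number back into hex digits
const toNumber = arr => parseInt(Array.from(arr, v => v.toString(16)).join(''), 16)
const fromNumber = (n, len) =>
  Uint8Array.from(n.toString(16).padStart(len, '0').slice(-len), ch => parseInt(ch, 16))

toNumber(a)                               // 4369
fromNumber(toNumber(a) >>> 8, 4)          // Uint8Array [0, 0, 1, 1]  (0x0011 = 0b10001)
fromNumber(toNumber(a) + toNumber(b), 4)  // Uint8Array [2, 2, 2, 2]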