## Big O of multiplying 2 complex numbers?


What is the time complexity for multiplying two complex numbers? For example (35 + 12i) * (45 + 23i)

The asymptotic complexity is the same as for multiplying the components.

(35 + 12i) * (45 + 23i) == 35*45 + 45*12i + 35*23i - 12*23
                        == (35*45 - 12*23) + (45*12 + 35*23)i


You just have 4 real multiplications and 2 real additions.

So, if real multiplication is O(1), so is complex multiplication.

If real multiplication is not constant (as is the case for arbitrary precision values), then neither is complex multiplication.
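The four-multiplication rule above can be sketched directly; the function name here is just for illustration:

```python
# Sketch: complex multiplication via four real multiplications.
# (a + bi)(c + di) = (ac - bd) + (ad + bc)i
def complex_mul(a, b, c, d):
    """Multiply (a + bi) by (c + di); returns (real, imag)."""
    real = a * c - b * d   # two multiplications, one subtraction
    imag = a * d + b * c   # two multiplications, one addition
    return real, imag

print(complex_mul(35, 12, 45, 23))  # (1299, 1345)
```

Whatever the cost M of one real multiplication, the total is 4M plus two cheaper additions, so the asymptotic complexity matches real multiplication.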


If you multiply two complex numbers (a + bi) and (c + di), the calculation works out to (ac - bd) + (ad + bc)i, which requires a total of four multiplications plus one subtraction and one addition. Additions and subtractions take less time than multiplications, so the main cost is the four multiplications done here. Since four is a constant, this doesn't change the big-O runtime of doing the multiplications compared to the real-number case.

Let's imagine you have two numbers n1 and n2, each of which is d digits long. If you use the grade-school method for multiplying these numbers together, you'd do the following:

    for each digit d1 of n2, in reverse:
        let carry = 0
        for each digit d2 of n1, in reverse:
            let product = d1 * d2 + carry
            write down product mod 10
            set carry = product / 10, rounding down
        write down the final carry

then add up all d of the d-digit numbers you wrote in the loop, each shifted one place further left
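The pseudocode above can be sketched in Python on digit lists (most-significant digit first). This version folds the final "add up the rows" step into an accumulator array rather than writing down d separate numbers, but the digit-by-digit work is the same; the function name is illustrative:

```python
# Sketch of grade-school (long) multiplication on digit lists,
# most-significant digit first, accumulating partial products in place.
def long_multiply(n1, n2):
    d1, d2 = len(n1), len(n2)
    result = [0] * (d1 + d2)             # the product has at most d1 + d2 digits
    for i in range(d1 - 1, -1, -1):      # each digit of n1, in reverse
        carry = 0
        for j in range(d2 - 1, -1, -1):  # each digit of n2, in reverse
            product = n1[i] * n2[j] + result[i + j + 1] + carry
            result[i + j + 1] = product % 10   # write down product mod 10
            carry = product // 10              # carry = product / 10, rounded down
        result[i] += carry               # final carry of this row
    return result

# 123 * 45 = 5535
print(long_multiply([1, 2, 3], [4, 5]))  # [0, 5, 5, 3, 5]
```

The nested loops make the Θ(d²) cost easy to see: every digit of n1 meets every digit of n2 exactly once.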


That first loop runs in time Θ(d²), since each digit of n2 is paired and multiplied with each digit of n1, doing O(1) work apiece. The result is d different d-digit numbers. Adding up those numbers will take time Θ(d²), since you have to scan each digit of each number exactly once. Overall, this takes time Θ(d²).

Notice that this runtime is a function of how many digits are in n1 and n2, rather than n1 and n2 themselves. The number of digits in a number n is Θ(log n), so this runtime is actually O((log max{n1, n2})²) if you're multiplying two numbers n1 and n2.

This is not the fastest way to do multiplications, though for a while it was conjectured to be. Karatsuba's algorithm runs in time O((log max{n1, n2})^(log₂ 3)), where the exponent log₂ 3 is about 1.585. There are more modern algorithms that run even faster than this (Harvey and van der Hoeven's algorithm multiplies d-digit numbers in time O(d log d)), and it's an open problem whether it can be done in time linear in the number of digits.
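Karatsuba's trick replaces the four sub-multiplications of the naive divide-and-conquer with three, which is where the log₂ 3 exponent comes from. A minimal sketch on Python integers, splitting on the binary representation (names and base-case threshold are illustrative):

```python
# Sketch of Karatsuba multiplication for nonnegative integers,
# recursing on high/low halves of the binary representation.
def karatsuba(x, y):
    if x < 16 or y < 16:                      # small base case: multiply directly
        return x * y
    m = max(x.bit_length(), y.bit_length()) // 2
    hi_x, lo_x = x >> m, x & ((1 << m) - 1)   # x = hi_x * 2^m + lo_x
    hi_y, lo_y = y >> m, y & ((1 << m) - 1)   # y = hi_y * 2^m + lo_y
    p = karatsuba(hi_x, hi_y)                 # high * high
    q = karatsuba(lo_x, lo_y)                 # low * low
    r = karatsuba(hi_x + lo_x, hi_y + lo_y)   # (hi+lo)*(hi+lo) = p + cross + q
    # x*y = p*2^(2m) + (r - p - q)*2^m + q    -- only three recursive multiplies
    return (p << (2 * m)) + ((r - p - q) << m) + q

print(karatsuba(1234, 5678))  # 7006652
```

Note the same three-multiplications-instead-of-four idea reappears below for complex numbers; it is the identity (a + b)(c + d) = ac + ad + bc + bd doing the work in both cases.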


Multiplying two complex numbers only requires three real multiplications (Gauss's trick), at the cost of a few extra additions and subtractions.

Let p = a * c, q = b * d, and r = (a + b) * (c + d).

Then (a + bi) * (c + di) = (p - q) + i(r - p - q).
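The three-multiplication identity above, as a runnable sketch (the function name is illustrative):

```python
# Sketch: Gauss's trick -- complex multiplication with three real multiplications.
def complex_mul3(a, b, c, d):
    """Multiply (a + bi) by (c + di) using only three real multiplications."""
    p = a * c
    q = b * d
    r = (a + b) * (c + d)          # = ac + ad + bc + bd
    return p - q, r - p - q        # (real, imag) = (ac - bd, ad + bc)

print(complex_mul3(35, 12, 45, 23))  # (1299, 1345)
```

This matters when the components are expensive to multiply (e.g. arbitrary-precision numbers, or matrices): trading one multiplication for three cheap additions is a net win, even though it changes nothing asymptotically for fixed-size components.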