woelen
Super Administrator
Posts: 8012
Registered: 20-8-2005
Location: Netherlands
Member Is Offline
Mood: interested
|
|
math help
I could help YT2095 with some math, but now I could use some help with math myself.
I need to solve a quadratic equation, in an algebra with multiplication which does not commute, i.e. a*b is not equal to b*a.
The equation to be solved is a simple quadratic one:
x*a*x + b*x + c = 0
Here x is the unknown, a has an inverse a', such that a'*a = a*a' = 1.
How could this be solved? I'm totally stuck.
This of course is a very simple question when a*b = b*a for all a and b, but once that may no longer be assumed, things become totally
different. Any clue?
|
|
YT2095
International Hazard
Posts: 1091
Registered: 31-5-2003
Location: Just left of Europe and down a bit.
Member Is Offline
Mood: within Nominal Parameters
|
|
Quote: | Originally posted by woelen
in an algebra with multiplication which does not commute, i.e. a*b is not equal to b*a.
|
How can this be?
Unless one of the variables, either A or B, has a Function as well as a Number attached to it, or a Priority Function like Brackets with an equation
inside.
I'm making no attempt to try and answer this (I think we all know how crap I am at maths!), I'm asking because I'm curious too.
"In a world full of wonders mankind has managed to invent boredom" - Death
Twinkies don't have a shelf life. They have a half-life! - Caine (a friend of mine)
|
|
12AX7
Post Harlot
Posts: 4803
Registered: 8-3-2005
Location: oscillating
Member Is Offline
Mood: informative
|
|
Is there also a division operator?
Non-commuting multiplication is quite common in linear algebra, when multiplying vectors and matrices and such.
[Edited on 11-14-2007 by 12AX7]
|
|
woelen
Super Administrator
Posts: 8012
Registered: 20-8-2005
Location: Netherlands
Member Is Offline
Mood: interested
|
|
Yes, there is a division operator, but a left and a right division.
E.g. if a*b = c, then b = a'*c (provided a has an inverse, i.e. a is not equal to 0).
But if b*a = c, then b = c*a'.
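To make the left/right distinction concrete, here is a minimal Python sketch (an illustration only, using 2x2 matrices over GF(p) for a small prime p rather than the GF(p^n) setting of the actual problem; mul and inv are just ad-hoc helpers):

```python
# Minimal sketch: 2x2 matrices over GF(p), p prime. Multiplication does not
# commute, and "division" splits into a left form and a right form.
p = 7

def mul(A, B):
    """Product of two 2x2 matrices, reduced mod p."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) % p for j in range(2)]
            for i in range(2)]

def inv(A):
    """Inverse of a 2x2 matrix mod p via the adjugate (needs det != 0 mod p)."""
    det = (A[0][0] * A[1][1] - A[0][1] * A[1][0]) % p
    d = pow(det, p - 2, p)                   # det^(-1) mod p by Fermat's little theorem
    return [[ A[1][1] * d % p, -A[0][1] * d % p],
            [-A[1][0] * d % p,  A[0][0] * d % p]]

a = [[1, 2], [3, 4]]
b = [[0, 1], [1, 1]]
print(mul(a, b) == mul(b, a))        # False: a*b != b*a

c = mul(a, b)                        # if a*b = c ...
print(mul(inv(a), c) == b)           # ... then b = a'*c   (left division)  -> True
c = mul(b, a)                        # but if b*a = c ...
print(mul(c, inv(a)) == b)           # ... then b = c*a'   (right division) -> True
```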
@YT2095: In normal arithmetic as we all learn at school, multiplication always has the property a*b = b*a, for all a and b. But there are many
algebras where this nice property does not hold (sadly). 12AX7 already mentioned an example. My question indeed comes from linear algebra, but not
over the field of real numbers; it is over a finite field GF(p^n), not to be confused with the ring Z/p^nZ, which is not even a field (the latter has an
a and b, such that a and b both are non-zero, while the product a*b equals 0). But that does not matter; the question should preferably be answered
without referring to some specific algebra.
[Edited on 14-11-07 by woelen]
|
|
pantone159
National Hazard
Posts: 590
Registered: 27-6-2006
Location: Austin, TX, USA
Member Is Offline
Mood: desperate for shade
|
|
Not sure how helpful this is, but my thoughts so far...
You want to solve: xax + bx + c = 0, which would be the normal quadratic equation if things commuted.
I assume that addition commutes, so a+b = b+a, and normal constant multiplication also commutes, so 2ab = a2b = ab2.
Is that true?
Approaching this like a normal quadratic:
Step 1 - Move c over
xax + bx = -c
Step 2 - Premultiply by 4a
4axax + 4abx = -4ac
Step 3 - Add b*b
4axax + 4abx + b*b = b*b - 4ac
Now, if this were with normal numbers, then the left hand side is equal to:
LHS = (2ax + b)^2
which would then give you:
(2ax + b)^2 = b*b - 4ac
which you solve for x to get the usual quadratic formula.
But, in your case,
(2ax + b)*(2ax + b) =
(2ax)(2ax) + (2ax)(b) + (b)(2ax) + (b)(b) =
4axax + 2axb + 2bax + b*b
This compares to the left side of Step 3:
4axax + 4abx + b*b (LHS 3)
4axax + 2axb + 2bax + b*b (yours)
These are equal if:
2abx = axb + bax
Does that happen to be the case for you? If so, then the usual formula would work.
Otherwise, I'm not sure.
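For what it's worth, here is a quick numerical check of exactly this point (an illustrative numpy sketch with random real matrices): the completed square only matches the left-hand side of Step 3 when the identity 2abx = axb + bax holds, which it generally does not.

```python
import numpy as np

rng = np.random.default_rng(0)
a, b, x = (rng.standard_normal((3, 3)) for _ in range(3))

lhs_step3 = 4 * a @ x @ a @ x + 4 * a @ b @ x + b @ b    # 4axax + 4abx + b*b
square = (2 * a @ x + b) @ (2 * a @ x + b)               # (2ax + b)*(2ax + b)

print(np.allclose(lhs_step3, square))                         # False in general
print(np.allclose(2 * a @ b @ x, a @ x @ b + b @ a @ x))      # the missing identity, also False
```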
|
|
woelen
Super Administrator
Posts: 8012
Registered: 20-8-2005
Location: Netherlands
Member Is Offline
Mood: interested
|
|
Pantone, thanks for your help, but the last point is exactly where I get stuck. I cannot assume that a*x*b = b*a*x, and so the entire mechanism for
solving the equation breaks down. What you give is a derivation of the well-known
formula for solving quadratic equations, as it is done at high school.
I've done quite some algebraic fiddling around, and I also searched the Internet, but no success so far. Most likely this kind of equation has a
special name. If anyone knows that name, that would also be helpful, because it would make searching easier.
|
|
pantone159
National Hazard
Posts: 590
Registered: 27-6-2006
Location: Austin, TX, USA
Member Is Offline
Mood: desperate for shade
|
|
Any chance of that last bit being off by a constant?
I.e., 2abx = axb + bax + D, for some constant D.
If that were true, you'd win as well.
Maybe you could solve it iteratively?
E.g., Newton's method, for normal numbers at least, is
x1 = x0 - f(x0) / f'(x0), where f' is the derivative.
For your f(x) = xax + bx + c, f'(x) = ax + xa + b.
I'm not sure if the arithmetic has to be adjusted for your stuff.
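To illustrate the suggestion, here is a sketch with numpy over the reals only (woelen explains below why this cannot be carried over to finite fields): the derivative of f(X) = XAX + BX + C in the direction H is HAX + XAH + BH, and each Newton step solves that linear equation for H via a Kronecker-product trick. The function name and the test setup are just illustrative choices.

```python
import numpy as np

def newton_matrix_quadratic(A, B, C, X0, steps=50, tol=1e-12):
    """Newton iteration for f(X) = X A X + B X + C = 0 over real matrices."""
    n = A.shape[0]
    I = np.eye(n)
    X = X0.copy()
    for _ in range(steps):
        F = X @ A @ X + B @ X + C
        if np.linalg.norm(F) < tol:
            break
        # Row-major vectorization: vec(H A X + X A H + B H) = J vec(H)
        J = np.kron(I, (A @ X).T) + np.kron(X @ A, I) + np.kron(B, I)
        H = np.linalg.solve(J, -F.flatten()).reshape(n, n)
        X = X + H
    return X

rng = np.random.default_rng(1)
n = 3
A, B, R = (rng.standard_normal((n, n)) for _ in range(3))
C = -(R @ A @ R + B @ R)                        # built so that f(R) = 0
X = newton_matrix_quadratic(A, B, C, R + 0.1 * rng.standard_normal((n, n)))
print(np.linalg.norm(X @ A @ X + B @ X + C))    # small residual if it converged
```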
|
|
Geomancer
Hazard to Others
Posts: 228
Registered: 21-12-2003
Member Is Offline
Mood: No Mood
|
|
If you're working with a division algebra over a finite field, it must be infinite dimensional, no? I can't say I've ever worked with these guys, let
alone seen a solution for the problem. I can construct simple examples, but perhaps you could fill us in on how you came by this creature, and exactly
what it looks like?
|
|
woelen
Super Administrator
Posts: 8012
Registered: 20-8-2005
Location: Netherlands
Member Is Offline
Mood: interested
|
|
This is a problem from cryptography. Right now I cannot go into much detail, but it has to do with computing quadratic residues, not for
numbers, but for square matrices. The problem of computing quadratic residues is a big one on its own, but when done modulo a prime, things are much
less demanding.
So, the elements a, b, c and x are matrices with elements of a finite field GF(p^n). These are discrete fields of characteristic p, and hence
Newton-like methods are not an option here.
Finally, after searching with quite some effort, I found some material about this subject, but it does not look promising:
https://kb.osu.edu/dspace/bitstream/1811/22234/1/V074N5_273
Apparently, the general solution of this type of equation is not (yet) known, although I must admit that the info in that paper is more than 30
years old, so things may have changed, though not that I know of. So I have hit something very difficult. I am really amazed that this seemingly simple
problem has not yet been solved.
So, I am not surprised if you cannot come up with a solution. Probably I have to dive into the crypto-stuff again and change the direction of (re)search.
|
|
Geomancer
Hazard to Others
Posts: 228
Registered: 21-12-2003
Member Is Offline
Mood: No Mood
|
|
Quote: | Originally posted by woelen
Yes, there is a division operator, but a left and a right division.
E.g. if a*b = c, then b = a'*c (provided a has an inverse, i.e. a is not equal to 0).
[Edited on 14-11-07 by woelen] |
I see. That was the bit that confused me. I assume you mistyped, since nonzero matrices are not, in general, invertible.
Note that your problem, as I understand it, is not exactly analogous to the classical one. You seem to be interested in determining the existence of a
solution within the base algebra, whereas the classical problem involves constructing a field that contains all solutions.
|
|
chemrox
International Hazard
Posts: 2961
Registered: 18-1-2007
Location: UTM
Member Is Offline
Mood: LaGrangian
|
|
Question: Is this a matrix algebra or some other linear calculus?
"When you let the dumbasses vote you end up with populism followed by autocracy and getting back is a bitch." Plato (sort of)
|
|
DerAlte
National Hazard
Posts: 779
Registered: 14-5-2007
Location: Erehwon
Member Is Offline
Mood: Disgusted
|
|
Woelen - I came across this which you posted some time ago and I obviously missed. It intrigued me and I looked into it. You wrote:
Quote: | I need to solve a quadratic equation, in an algebra with multiplication which does not commute, i.e. a*b is not equal to b*a. The equation to be
solved is a simple quadratic one:
x*a*x + b*x + c = 0
Here x is the unknown, a has an inverse a', such that a'*a = a*a' = 1.
How could this be solved? I'm totally stuck….
……Yes, there is a division operator, but a left and a right division. E.g. if a*b = c, then b = a'*c (provided a has an inverse, i.e. a is not
equal to 0).
But if b*a = c, then b = c*a'….
….but over a finite field GF(p^n), not to be confused with the ring Z/p^nZ, which is not even a field (the latter has an a and b, such that a and b
both are non-zero, while the product a*b equals 0). But that does not matter, the question preferably should be answered, without referring to
some specific algebra…. – (italics mine)
…So, the elements a,b,c and x are matrices with elements of a finite field (GF(p^n)). These are discrete fields of characteristic p
|
Are you still interested in this or have you found a solution? I can solve it, I think, for certain conditions, but I have to re-interpret your
equation to make sense of it in a matrix fashion. In particular, I re-write it as:
x'*a*x + b'*x + c = 0; else I cannot interpret it.
I have written up a solution but cannot write it in this very limited font available on the forum. It is a few pages long, in WORD DOC format. I will
post it via ftp if you are still interested. I am still wrestling over its validity in GF(p^n) – my abstract algebra is a bit sketchy after 20
years of disuse! I used to use linear algebra in all sorts of applications, from circuit theory to pattern recognition, but only touched lightly on
abstract algebra during the period I was interested in coding for error correction and crypto. It intrigues me, but the excessive terminology is daunting…
Regards,
Der Alte
|
|
woelen
Super Administrator
Posts: 8012
Registered: 20-8-2005
Location: Netherlands
Member Is Offline
Mood: interested
|
|
Der Alte, I certainly am interested in the solution of your reformulated problem. By a' do you mean the transpose of a matrix, or the
conjugate transpose? For real matrices this point is moot, but for complex matrices it does matter.
|
|
sparkgap
International Hazard
Posts: 1234
Registered: 16-1-2005
Location: not where you think
Member Is Offline
Mood: chaotropic
|
|
You might want to look up the "quadratic eigenvalue problem" and "matrix solvent" if it's going to branch into linear algebra.
sparky (~_~)
"What's UTFSE? I keep hearing about it, but I can't be arsed to search for the answer..."
|
|
microcosmicus
Hazard to Others
Posts: 287
Registered: 31-12-2007
Member Is Offline
Mood: spin up
|
|
Since you state that a is invertible, we can make a preliminary
simplification. Letting a' denote the inverse of a and defining
y = ax, your equation becomes
a'*y*y + b*a'*y + c = 0 .
We can multiply through by a and define B = a*b*a' and
C = a*c to then obtain the equation
y*y + B*y + C = 0 .
In other words, it is at least possible to eliminate a from consideration.
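A quick numerical check of this substitution (an illustrative Python/numpy sketch, using 2x2 matrices mod a small prime p rather than the full GF(p^n) setting; mmod and inv2_mod are ad-hoc helpers): if x solves x*a*x + b*x + c = 0, then y = a*x solves y*y + B*y + C = 0 with B = a*b*a' and C = a*c.

```python
import numpy as np

p = 11

def mmod(*Ms):
    """Multiply matrices left to right, reducing mod p after each product."""
    out = Ms[0] % p
    for M in Ms[1:]:
        out = out @ M % p
    return out

def inv2_mod(A):
    """Inverse of a 2x2 integer matrix mod p (p prime), via the adjugate."""
    det = int(A[0, 0] * A[1, 1] - A[0, 1] * A[1, 0]) % p
    d = pow(det, p - 2, p)                      # det^(-1) mod p
    adj = np.array([[A[1, 1], -A[0, 1]], [-A[1, 0], A[0, 0]]])
    return (d * adj) % p

rng = np.random.default_rng(2)
a = np.array([[2, 3], [1, 4]])                  # det = 5, so invertible mod 11
b = rng.integers(0, p, (2, 2))
x = rng.integers(0, p, (2, 2))
c = (-(mmod(x, a, x) + mmod(b, x))) % p         # chosen so x*a*x + b*x + c = 0 (mod p)

ai = inv2_mod(a)
y, B, C = mmod(a, x), mmod(a, b, ai), mmod(a, c)
print(not ((mmod(y, y) + mmod(B, y) + C) % p).any())   # True: y*y + B*y + C = 0 (mod p)
```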
Unfortunately, this is about all one can do at this level of generality: your request that
"the question preferably should be answered without referring to some specific algebra" doesn't
look like it can be fulfilled in full generality, because there are too many non-commutative algebras which
behave in all sorts of ways. By using generators and relations, one can cook
up an algebra in which your equation has pretty much any type of solutions.
If we take a free algebra, the equation x*x + b*x + c = 0
obviously has no solutions, because a solution would be a relation
but, by definition, free algebras have no relations. If instead we take an
algebra with generators a, b, c and the relation a*a + b*a + c = 0,
then x = a is a solution by construction.
Likewise, we could concoct algebras in which the equation has any
number of solutions, finite or infinite, or in which the solutions happen to
equal a specified expression in b and c, or no such expression.
Therefore, to make progress, it seems we need to narrow the class of
algebras under consideration. Especially since that is where your question
arose, the class of matrix algebras whose coefficients lie in a field is a
good choice. To stay general, I will not specify the field, but do things
which would be valid for all fields.
To begin, we can do something like completing the square to at
least eliminate the trace of b, if not all of b. Set x = y - (1/4)(tr b) I,
where I is the identity matrix. (The factor 1/4 fits the 2x2 case treated
below; for n x n matrices one would shift by (1/(2n))(tr b) I instead.)
Then, upon expanding, we obtain the following equation:
y*y + (b - (1/2)(tr b) I)*y + c - (1/4)(tr b) b + (1/16)(tr b)^2 I = 0
Thus, without loss of generality, we may assume that the trace of b
is zero.
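A small numerical check of this shift (an illustrative numpy sketch over the reals; the algebraic identity itself only needs 2 and 4 to be invertible in the field): substituting x = y - (1/4)(tr b) I reproduces the displayed equation, and for 2x2 matrices the new linear coefficient b - (1/2)(tr b) I is indeed trace-free.

```python
import numpy as np

rng = np.random.default_rng(3)
b, c, y = (rng.standard_normal((2, 2)) for _ in range(3))
I = np.eye(2)
t = np.trace(b)

x = y - 0.25 * t * I
lhs = x @ x + b @ x + c
rhs = y @ y + (b - 0.5 * t * I) @ y + c - 0.25 * t * b + (1 / 16) * t**2 * I

print(np.allclose(lhs, rhs))                          # True: the two forms agree
print(np.isclose(np.trace(b - 0.5 * t * I), 0.0))     # True in the 2x2 case
```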
A key observation is that the form of our equation is invariant under
conjugation. Suppose we set x = m*y*m', where m is
a matrix with inverse m'. Then we see that
y*y + (m'*b*m)*y + m'*c*m = 0 .
Thus, the number and nature of the solutions is invariant under conjugation
and it makes sense to make use of this fact to simplify the problem. One
way is to put the matrices into a canonical form. If the field is algebraically
closed, then one can put one of the matrices, say b, into Jordan
normal form and simplify the other matrix some. However, since this is
not the case for the fields which interest you, I will not carry this idea
further now.
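The conjugation step is also easy to check numerically (again just an illustrative numpy sketch over the reals): if x is a root of x*x + b*x + c = 0 and x = m*y*m', then y is a root of the conjugated equation.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 3
b, m, x = (rng.standard_normal((n, n)) for _ in range(3))
c = -(x @ x + b @ x)                 # force x to be a root of x*x + b*x + c = 0
mi = np.linalg.inv(m)

y = mi @ x @ m                       # then x = m*y*m'
residual = y @ y + (mi @ b @ m) @ y + mi @ c @ m
print(np.allclose(residual, 0))      # True
```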
Another idea is to make use of invariants and covariants. The traces
of products of matrices are invariant under conjugation; by the
Cayley-Hamilton theorem, one can express all such quantities in
terms of a finite number of them. Then one could classify the solutions
of your equation for matrices of a certain degree by what these
invariants are doing. Likewise, one could express the solutions
in a covariant form. For the fun of it, I am doing this in the 2x2 case;
since it is quite tedious to write math in this forum, I will stop here
and soon post a TeX document with this analysis of the 2x2 case.
BTW, DerAlte, could you also post what you did, or send me a copy?
[Edited on 1-2-2008 by microcosmicus]
|
|
DerAlte
National Hazard
Posts: 779
Registered: 14-5-2007
Location: Erehwon
Member Is Offline
Mood: Disgusted
|
|
For what it's worth, folks:
http://www.sciencemadness.org/scipics/Solve xT.doc
Purely formal solution. Was going to add a bit but ran out of steam: comments and criticism invited! I have not given any serious thought to validity
in anything other than the field R.
@Woelen: A' is the normal transpose. I assumed X was real and not complex or Hermitian.
Regards,
DerAlte
[Edited on 1-2-2008 by DerAlte]
|
|
woelen
Super Administrator
Posts: 8012
Registered: 20-8-2005
Location: Netherlands
Member Is Offline
Mood: interested
|
|
When I read the article, I see that you are not actually solving the equation you mention (it cannot be solved as it stands, because it has N unknowns
while there is only one equation), but you reduce it to a purely quadratic form.
I think that this method can be used to transform N equations of the form x'Ax + b'x + c = 0 into N purely quadratic equations, allowing these N
equations to be solved as a linear system in N purely quadratic forms. I'll have to go deeper into this matter. Der Alte, your article gives me some nice
ideas, and if I find something interesting based on them, I'll come back to that.
|
|
Geomancer
Hazard to Others
Posts: 228
Registered: 21-12-2003
Member Is Offline
Mood: No Mood
|
|
A summary of DerAlte's method for those so inclined:
The equation x^t A x + b^t x + c = 0 can be written as <x, x> + <b', x> + c = 0, where <,> is a symmetric bilinear function. This equation can be
reduced to the form <y, y> = k by a method entirely analogous to that used to solve the quadratic equation. Note that this holds for all fields of
characteristic other than 2. (For characteristic 2 the form <,> might not exist, and completing the square also fails.) <y, y> = k is equivalent to
the system sum(a_i x_i^2) = k, where the a_i are either +/-1 or non-quadratic residues (look up "diagonalization of quadratic forms" for more
information).
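A numerical illustration of that reduction (an illustrative numpy sketch over the reals, assuming the symmetric matrix A of the form is invertible): completing the square with y = x + (1/2) A^(-1) b turns x^t A x + b^t x + c = 0 into <y, y> = k with k = (1/4) b^t A^(-1) b - c.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 4
M = rng.standard_normal((n, n))
A = (M + M.T) / 2                    # symmetric (and generically invertible)
b = rng.standard_normal(n)
x = rng.standard_normal(n)
c = -(x @ A @ x + b @ x)             # chosen so that x satisfies the equation

y = x + 0.5 * np.linalg.solve(A, b)
k = 0.25 * b @ np.linalg.solve(A, b) - c
print(np.isclose(y @ A @ y, k))      # True: <y, y> = k
```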
Nice math, but I'm not sure it helps with the original problem. My approach would be to try and find the invariant subspaces of the solution (X,
considered as an operator on a vector space). Microcosmicus conveniently reduces the problem to X^2 + BX + C = 0.
Using this form of the problem, note that if a space I is X-invariant (i.e. XI = I), then BI + I = CI + I. A variant of the Microcosmicus
trace elimination technique may allow one to find the spaces I from B and C. This should be enough to solve over an algebraically closed field. For
finite fields, then, one could extend the field to a completion, solve, and discard the solutions that aren't in the original field. That's all I have; I
actually should be working on other things, but if a solution pops into my head uninvited I'll post it.
|
|
DerAlte
National Hazard
Posts: 779
Registered: 14-5-2007
Location: Erehwon
Member Is Offline
Mood: Disgusted
|
|
@Woelen - absolutely. I am having a look at the case where X is also an nxn matrix. There you have N^2 equations and N^2 unknowns and, in principle,
there is a solution. But it's damned complex....
@Geomancer and Microcosmicus - interesting posts, I need more time to digest. The old brain has slowed down a bit these days.
I intend to repost the Solve xT.doc screed with a few additions soon, when I've sorted some ideas out.
Regards,
Der Alte
|
|
DerAlte
National Hazard
Posts: 779
Registered: 14-5-2007
Location: Erehwon
Member Is Offline
Mood: Disgusted
|
|
A bit added at the end (titled addendum), a stab at the NxN case for X. The rest unaltered.
http://www.sciencemadness.org/scipics/Solve_xT.doc
Der Alte
|
|