A question of history
I may be wrong, but perhaps I am the first mathematician to discover the following property of polynomials:
Let f(x) be a polynomial in x, where x may be a rational integer, a Gaussian integer, or a square matrix whose elements are rational integers or Gaussian integers. Then f(x + k*f(x)) == 0 (mod f(x)) for every integer k. 
x+k*f(x) == x (mod f(x))
==> f(x+k*f(x)) == f(x) == 0 (mod f(x)). QED. Looks like a trivial result. 
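The two-line proof above can be spot-checked numerically. A minimal sketch in Python, using f(x) = x^2 + x + 1 as an arbitrary sample polynomial (it is never zero for integer x, so the modulus is always valid):

```python
# Spot-check of f(x + k*f(x)) == 0 (mod f(x)) for integer x and k.
# f is a sample choice; any integer polynomial with f(x) != 0 works the same way.

def f(x):
    return x * x + x + 1

for x in range(-10, 11):
    for k in range(-5, 6):
        # x + k*f(x) == x (mod f(x)), hence f(x + k*f(x)) == f(x) == 0 (mod f(x))
        assert f(x + k * f(x)) % f(x) == 0

print("identity holds for all tested x, k")
```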
[QUOTE=axn;459970]x+k*f(x) == x (mod f(x))
==> f(x+k*f(x)) == f(x) (mod f(x)). QED. Looks like a trivial result.[/QUOTE]Except at the roots of f(x), where f(x) = 0 and the modulus vanishes. 
[QUOTE=axn;459970]x+k*f(x) == x (mod f(x))
==> f(x+k*f(x)) == f(x) (mod f(x)). QED. Looks like a trivial result.[/QUOTE] Agreed, the result is not earthshaking. However, it has a couple of interesting applications, one of which I would like to mention in this post: indirect primality testing. Let f(x) be a quadratic polynomial in x (x in Z); for example, let f(x) be x^2 + x + 1. All x other than those generated by 1 + 3k, 2 + 7k, 3 + 13k, 4 + 3k, 4 + 7k, 5 + 31k, ... are such that f(x) is prime and need not be tested for primality. Unfortunately this is true only up to the quadratic level. 
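The sieve idea in the post above can be sketched as follows. A prime p divides f(x) exactly when x lies in a residue class r (mod p) with f(r) == 0 (mod p), so sieving those progressions for every prime p up to sqrt(f(x_max)) leaves exactly the x with f(x) prime (taking care not to strike x where f(x) is the sieving prime itself). The function names here are illustrative, not from the post:

```python
# Sieve of residue-class progressions for f(x) = x^2 + x + 1:
# e.g. x ≡ 1 (mod 3), x ≡ 2 or 4 (mod 7), x ≡ 3 or 9 (mod 13), ... give composite f(x).
import math

def f(x):
    return x * x + x + 1

def primes_up_to(n):
    sieve = bytearray([1]) * (n + 1)
    sieve[:2] = b"\x00\x00"
    for i in range(2, math.isqrt(n) + 1):
        if sieve[i]:
            sieve[i * i::i] = bytearray(len(sieve[i * i::i]))
    return [i for i, is_p in enumerate(sieve) if is_p]

def prime_values_of_f(x_max):
    # Sieve with every prime up to sqrt(f(x_max)); f is increasing for x >= 0,
    # so any composite f(x) has a prime factor below this bound.
    limit = math.isqrt(f(x_max))
    composite = [False] * (x_max + 1)
    for p in primes_up_to(limit):
        for r in (r for r in range(p) if f(r) % p == 0):
            for x in range(r, x_max + 1, p):
                if f(x) != p:          # f(x) = p itself is prime, keep it
                    composite[x] = True
    # start at 1: f(0) = 1 is not prime
    return [x for x in range(1, x_max + 1) if not composite[x]]
```

Running `prime_values_of_f(50)` returns exactly the x in 1..50 for which x^2 + x + 1 is prime, without any direct primality test on the survivors.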
[QUOTE=devarajkandadai;459961]I may be wrong perhaps I am the first mathematician to discover the following property of polynomials:
Let f(x) be a polynomial in x ( x belongs to Z, can be a Gaussian integer, or be a square matrix in which the elements are rational integers or Gaussian integers). Then f(x + k*f(x)) == 0 (mod f(x)).[/QUOTE] Since A - B is an algebraic factor of A^n - B^n for every nonnegative integer n, we have that if f(x) is a polynomial in [b]K[/b][x], where [b]K[/b] is a field, then A - B is an algebraic factor of f(A) - f(B). I imagine this has been known for centuries; I'm pretty sure Isaac Newton knew it, certainly for the cases where [b]K[/b] is the rational or real numbers. Of course, the result continues to hold in cases where [b]K[/b] is [i]not[/i] a field, but I'm not sure offhand just how far you can push it. If [b]K[/b] is a commutative ring (with 1) I don't see any reason it wouldn't work. In particular, substituting x + k*f(x) for A and x for B, k*f(x) is an algebraic factor of f(x + k*f(x)) - f(x). 
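The substitution in the last sentence is easy to illustrate numerically. A quick sketch with an arbitrary sample cubic (an integer spot-check only demonstrates the algebraic divisibility, it does not prove it):

```python
# Check that k*f(x) divides f(x + k*f(x)) - f(x) at many integer points.
# f is a sample cubic with no integer roots, so k*f(x) != 0 whenever k != 0.

def f(x):
    return x**3 - 2 * x + 5

for x in range(-8, 9):
    for k in range(-4, 5):
        d = k * f(x)
        if d != 0:
            assert (f(x + d) - f(x)) % d == 0

print("k*f(x) divides f(x + k*f(x)) - f(x) for all tested x, k")
```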
[QUOTE=Dr Sardonicus;460055]Since A - B is an algebraic factor of A^n - B^n for every nonnegative integer n, we have that if f(x) is a polynomial in [b]K[/b][x], where [b]K[/b] is a field, then
A - B is an algebraic factor of f(A) - f(B). I imagine this has been known for centuries; I'm pretty sure Isaac Newton knew it, certainly for the cases where [b]K[/b] is the rational or real numbers. Of course, the result continues to hold in cases where [b]K[/b] is [i]not[/i] a field, but I'm not sure offhand just how far you can push it. If [b]K[/b] is a commutative ring (with 1) I don't see any reason it wouldn't work. In particular, substituting x + k*f(x) for A and x for B, k*f(x) is an algebraic factor of f(x + k*f(x)) - f(x).[/QUOTE] Merely saying "I am pretty sure Isaac....." will not do; can you quote any paper or book wherein Newton, Euler, or any other mathematician has mentioned this result? 
[QUOTE=devarajkandadai;459961]I may be wrong perhaps I am the first mathematician to discover the following property of polynomials:
Let f(x) be a polynomial in x ( x belongs to Z, can be a Gaussian integer, or be a square matrix in which the elements are rational integers or Gaussian integers). Then f(x + k*f(x)) == 0 (mod f(x)).[/QUOTE] You are not the first mathematician; I discovered it too, a little earlier than you, maybe 10 years ago: see [URL]http://devalco.de/quadr_Sieb_x%5E2+1.php#1[/URL] :cool:, and I do not claim to be the first. But it is indeed a good basic idea for prime generators: if you add the proof that f(x - k*f(x)) == 0 (mod f(x)) (for k in Z), you have a good criterion for prime generators. Have a look at [URL]http://devalco.de/#106[/URL] and you will discover a little bit more about prime numbers and prime generators in quadratic progression. Nice greetings from the primes :pals: Bernhard 
[QUOTE=devarajkandadai;460189]Merely saying "I am pretty sure Isaac....." will not do; can you quote any paper or book wherein Newton, Euler, or any other mathematician has mentioned this result?[/QUOTE]The result is so trivial that any self-respecting mathematician would not even think of publishing it, especially so because it is incorrect as you first stated it (see my subsequent correction).

[QUOTE=devarajkandadai;460189]Merely saying "I am pretty sure Isaac....." will not do; can you quote any paper or book wherein Newton, Euler, or any other mathematician has mentioned this result?[/QUOTE]This will not do. [i]You[/i] are making a claim of priority. It is incumbent on [i]you[/i] to check the literature. I suggested Newton, whose work with polynomials is well known, both WRT derivatives of powers and "Newton's identities," but you refused to look. It is reasonable to conclude that you won't look because you're afraid of what you might find.
I [i]do[/i] know that in high school algebra, one of the exercises for learning mathematical induction was to prove that, for any positive integer n, a - b divides a^n - b^n. And while I will [i]not[/i] claim that back then we were doing our homework with a stylus on damp clay, I [i]will[/i] say that it was quite a number of years ago. So a result of which (a corrected form of) your claim is a trivial corollary was relegated to the exercises of high-school algebra long ago. No mathematician worthy of the name would presume to claim it as an original result. The result I mention is also often used to prove the formula for summing a geometric series. That has been known for a while, too. 
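The exercise mentioned above amounts to the explicit factorization a^n - b^n = (a - b)(a^(n-1) + a^(n-2)*b + ... + b^(n-1)), with the same cofactor that appears in the geometric-series formula. A small numerical check:

```python
# Verify a^n - b^n == (a - b) * (a^(n-1) + a^(n-2)*b + ... + b^(n-1))
# over a range of integers a, b and exponents n.

def cofactor(a, b, n):
    # the geometric-series cofactor: sum of a^(n-1-i) * b^i for i = 0..n-1
    return sum(a**(n - 1 - i) * b**i for i in range(n))

for a in range(-5, 6):
    for b in range(-5, 6):
        for n in range(1, 8):
            assert a**n - b**n == (a - b) * cofactor(a, b, n)

print("a - b divides a^n - b^n for all tested a, b, n")
```

With b = 1 and a = r, the same cofactor gives the familiar sum 1 + r + ... + r^(n-1) = (r^n - 1)/(r - 1).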