Dunkum Posted April 8, 2016

Been a while since I had to deal with the power series forms of sin and cos. I seem to recall sin having the odd factorials in the denominators, so I assume the derivation is related to that (don't feel like clicking that link right now).
Chaos Posted April 8, 2016

http://www.themathpage.com/atrig/double-proof.htm

If you were to attempt it, you'd have to apply the sum identity b times, and then see what you got. Probably not going to be super neat...

Well, I did some quick Googling. As expected, there is no general formula for this. Division makes things very bad. Dunkum, it's very simple to get from the double-angle identities, which aren't too bad to derive. No calculus required.
Haelbarde Posted April 8, 2016

(Wish we had LaTeX or MathPrint support or some such thing on the forum...)

Following the steps for the half-angle identity, but starting with cos(3a) instead of cos(2a), ends up with:

cos(theta) = cos(theta/3) * (4*cos(theta/3)^2 - 3)   (*)

In the original case, you'd have:

cos(theta) = 2*cos(theta/2)^2 - 1

That can be rearranged to:

cos(theta/2)^2 = (1 + cos(theta))/2

But that's just not possible with the factor of cos(theta/3) out the front on the RHS in (*). So given it won't work for theta/3, we certainly couldn't find a formula for cos(a/b) for all a and b.
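Haelbarde's triple-angle identity is easy to sanity-check numerically. Here's a short Python sketch (the function name is my own, not from the thread) comparing the identity against math.cos directly:

```python
import math

def cos_via_thirds(theta):
    """Evaluate cos(theta) through the triple-angle identity
    cos(theta) = cos(theta/3) * (4*cos(theta/3)^2 - 3)."""
    c = math.cos(theta / 3)
    return c * (4 * c * c - 3)

# The identity should match math.cos for any angle
for theta in (0.0, 0.5, 1.0, 2.0, 3.0):
    assert abs(cos_via_thirds(theta) - math.cos(theta)) < 1e-12
```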
Dunkum Posted April 8, 2016

Well, I did some quick Googling. As expected, there is no general formula for this. Division makes things very bad. Dunkum, it's very simple to get from the double-angle identities, which aren't too bad to derive. No calculus required.

Ahh, been even longer since I had to deal with those; not sure I remember most of my trig identities anymore.
Chaos Posted April 8, 2016

Ahh, been even longer since I had to deal with those; not sure I remember most of my trig identities anymore.

Not a worry. Practically speaking you'd Google them anyway. But in this case, the derivation was so specific that it could be generalized, so it was relevant.
Dunkum Posted April 8, 2016

Yea, if trig identities ever came up in a scenario where I felt compelled to answer something, I would go straight to Google. I only recall about 2 of them (well, closer to 6 if you count the simple 1/sin = csc type ones), and I don't think I remember any of the partial angle ones. Weird, they were probably the most interesting part of trig to me too.
AliasSheep Posted April 9, 2016

Not a worry. Practically speaking you'd Google them anyway.

And here I am having to memorise them for exams. :/
Oversleep Posted April 9, 2016

I never bothered to memorise the angle identities (you know, like what is sin(7/4*pi - x)?). I just imagine the graph and from there it's simple.
Hood Posted April 10, 2016

There's a site called betterexplained.com which has a nice article on trigonometric identities. http://betterexplained.com/articles/easy-trig-identities-with-eulers-formula/ The site has many interesting articles on other topics as well. I found it while searching about Bayes' Theorem. I was one of the many who preferred to "learn" the formula.
Chaos Posted April 10, 2016

And here I am having to memorise them for exams. :/

When I taught Precalculus I only expected people to memorize some. The sin(a/2) formula would not be one of those.
Lightning Posted April 10, 2016

Today, I learned about the formula sin(a/2) = ((1 - cos a)/2)^(1/2). My question is: how could one tweak it for sin(a/b)?

Assuming b is a positive integer, sin(a/b) satisfies a polynomial of degree b in sin(a) and cos(a). As you've already worked out, with degree bigger than 2 the formulas for the roots of such polynomials are (in general) quite complicated. However, when b is a power of 2, you can use the half-angle formula recursively (just apply it over and over), and it only involves repeated square roots. [The fact that sin(a/3) cannot be written using only square roots is used to show that trisecting an angle with straight-edge and compass is impossible.] Chaos, didn't know there was another math professor on this site! That's cool.
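Lightning's remark about powers of 2 can be made concrete: when b = 2^k, repeated square roots suffice. A minimal Python sketch (the function name is mine; it assumes 0 <= a <= pi so the positive root is always the right one):

```python
import math

def sin_a_over_2k(a, k):
    """Compute sin(a / 2**k) for 0 <= a <= pi starting only from cos(a),
    applying the half-angle formula k times (repeated square roots)."""
    c = math.cos(a)
    for _ in range(k):
        # cos(x/2) = sqrt((1 + cos x) / 2), valid while x stays in [0, pi]
        c = math.sqrt((1 + c) / 2)
    # sin(x) = sqrt(1 - cos(x)^2) for x in [0, pi]
    return math.sqrt(1 - c * c)
```

For example, sin_a_over_2k(math.pi, 3) recovers sin(pi/8) without ever calling sin on the small angle; for b = 3 no such square-root-only recipe exists, which is exactly the trisection obstruction mentioned above.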
Silverblade5 Posted May 19, 2016

I noticed something funny today: according to the power rule, f'(x) for f(x) = x^a is equal to ax^(a-1). If we apply this to x^x, we get x*x^(x-1), which is also equal to x^x.
Oversleep Posted May 19, 2016

I noticed something funny today: according to the power rule, f'(x) for f(x) = x^a is equal to ax^(a-1). If we apply this to x^x, we get x*x^(x-1), which is also equal to x^x.

That's... not how it works.

(x^x)' = x^x * (ln(x) + 1)

The formula for (x^a)' only works when a is a constant, not a function of the variable.
Chaos Posted May 24, 2016

I noticed something funny today: according to the power rule, f'(x) for f(x) = x^a is equal to ax^(a-1). If we apply this to x^x, we get x*x^(x-1), which is also equal to x^x.

As Oversleep was alluding to, the difference between a variable and a constant is very important for formulas at this stage in mathematics. In the context of differentiation rules, x is implicitly the independent variable of a function. It doesn't need to be called x (integration tables often list the independent variable as u for another reason), but that implicit assumption matters. We're also implicitly saying this is a function of just one variable, so a cannot be a variable. That's the error you're making: a must be a constant!

It's easy to see why this logic fails. Consider f(x) = e^x. We know that f'(x) = e^x as well (it's a defining property of the number e). Here, x is the variable and e is a constant, so the power rule does not apply: the power rule only works with a variable base and a constant exponent. Obviously f'(x) here is not xe^(x-1), because we already knew it was just e^x.

You can differentiate a constant base raised to a variable exponent (like b^x) by rewriting b^x = e^((ln b)x) and using the chain rule. The question of what to do with x^x, where both are variables, requires a subtler technique called logarithmic differentiation. This method will also work to differentiate things like x^sin(x), or any arbitrary function raised to another arbitrary function.

Starting with y = x^x, take the natural log of both sides:

ln y = ln(x^x)

By properties of logs,

ln y = x ln(x)

Now we can take the derivative of both sides, using the product rule on the right:

(ln y)' = ln(x) + x*(1/x)
(ln y)' = ln(x) + 1

Of course, this is the derivative of the log of y, not y itself. But the derivative of ln(y) for any y(x) is (1/y)*y' by the chain rule. So, to solve for y', all we must do is multiply by y:

y' = y*(ln(x) + 1)

Or,

y' = (ln(x) + 1)*x^x

In other words, what we did with this logarithmic differentiation was turn a difficult exponentiation into a multiplication, where we could use the product rule.
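Chaos's closed form is easy to check against a finite-difference approximation. A small Python sketch (helper names are my own, not from the thread):

```python
import math

def ddx_x_to_x(x):
    """Result of the logarithmic differentiation: (x^x)' = x^x * (ln x + 1)."""
    return x ** x * (math.log(x) + 1)

def central_diff(f, x, h=1e-6):
    """Numeric derivative of f at x via a central difference."""
    return (f(x + h) - f(x - h)) / (2 * h)

# The closed form agrees with the numeric derivative; the naive power-rule
# answer x*x^(x-1) = x^x coincides with it only at x = 1, where ln x + 1 = 1.
for x in (0.5, 1.0, 2.0):
    assert abs(central_diff(lambda t: t ** t, x) - ddx_x_to_x(x)) < 1e-5
```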
Silverblade5 Posted June 8, 2016

Has anyone ever heard of a concept where you create terms for an infinite series by taking a function, evaluating it at a certain point, and then using that output as the next input?
Eki Posted June 8, 2016

Has anyone ever heard of a concept where you create terms for an infinite series by taking a function, evaluating it at a certain point, and then using that output as the next input?

I guess you could call that an iterative series or something. This is pretty much what happens when you decide whether or not a point is inside the Mandelbrot set, except you generally only care about whether the series tends to infinity, not about the actual series of numbers. I think the Buddhabrot uses all the numbers in the different series though... (up until the absolute value goes above 2 or something, I don't remember)

Oh, and the Fibonacci series uses the last two outputs of the series to create every new one, but otherwise it also works like that.
Oversleep Posted June 8, 2016

Has anyone ever heard of a concept where you create terms for an infinite series by taking a function, evaluating it at a certain point, and then using that output as the next input?

Do you mean recursion? It seems like recursion, but it's a very basic concept so I'm not sure I understood you properly...
Eki Posted June 8, 2016

Do you mean recursion? It seems like recursion, but it's a very basic concept so I'm not sure I understood you properly...

Recursion is kinda... inverted iteration, in a way. You start at the goal and work towards the beginning, kind of. With iteration you go one step at a time, which fits better with a series.
Oversleep Posted June 8, 2016

Recursion is something referring to itself, but I see how you're thinking about the "inverted iteration". Most things (if not everything) done with recursion can be done with iteration and vice versa.
Dunkum Posted June 8, 2016

Yea, that is definitely recursion. The Fibonacci sequence is the classic example, but you can do it for pretty much anything.

The natural numbers: s_1 = 1, s_n = s_(n-1) + 1
Perfect squares: s_1 = 1, s_n = s_(n-1) + 2n - 1

Edit: though I think the proper math term for what the OP is talking about is sequence, not series. To be nitpicky: a sequence is just a (potentially infinite) bunch of terms; a series is what you get when you add them up.
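The pattern the thread is circling (feed each output back in as the next input) is usually called iterating a function. Here's a small Python sketch (names are my own) covering Dunkum's natural-numbers example and a Mandelbrot-style orbit:

```python
from itertools import islice

def iterate(f, x0):
    """Yield x0, f(x0), f(f(x0)), ... — each term is the previous
    output fed back in as the next input."""
    x = x0
    while True:
        yield x
        x = f(x)

# Natural numbers: s_1 = 1, s_n = s_(n-1) + 1
naturals = list(islice(iterate(lambda s: s + 1, 1), 5))    # [1, 2, 3, 4, 5]

# Mandelbrot-style orbit z -> z^2 + c for a sample constant c
orbit = list(islice(iterate(lambda z: z * z + 0.25j, 0), 4))
```

Fibonacci needs the last two terms rather than one, but it fits the same mold if you iterate on pairs: f((a, b)) = (b, a + b).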
Eki Posted June 13, 2016

Recursion is something referring to itself, but I see how you're thinking about the "inverted iteration". Most things (if not everything) done with recursion can be done with iteration and vice versa.

Yeah, I've spent a lot of time recently implementing a recursive algorithm on a graphics card, where recursion ranges from impossible to just a bad idea... It can be a bit annoying sometimes.

Edit: though I think the proper math term for what the OP is talking about is sequence, not series. To be nitpicky: a sequence is just a (potentially infinite) bunch of terms; a series is what you get when you add them up.

Oh right, I forgot about that!
Silverblade5 Posted June 21, 2016

Earlier, Chaos was talking about how the logarithmic function is transcendental. In answer to this, I've decided to construct a Maclaurin series for ln(x+1).

f(x) = ln(x+1), f(0) = 0
f'(x) = 1/(x+1), f'(0) = 1
f''(x) = -1/(x+1)^2, f''(0) = -1
f'''(x) = 2!/(x+1)^3, f'''(0) = 2!
f''''(x) = -3!/(x+1)^4, f''''(0) = -3!

f(x) = sum from n = 0 to infinity of (-1)^n * x^(n+1)/(n+1)

With this, you could calculate the natural log of any number (within the interval of convergence, -1 < x <= 1) as long as you subtract 1 from it first.
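The partial sums of the ln(1+x) Maclaurin series can be checked against math.log. A short Python sketch (the function name is mine); note the series converges only for -1 < x <= 1, i.e. inputs between 0 (exclusive) and 2:

```python
import math

def ln_series(y, terms=200):
    """Approximate ln(y) via the Maclaurin series for ln(1 + x),
    sum over n >= 0 of (-1)^n * x^(n+1) / (n + 1), with x = y - 1.
    Converges only for 0 < y <= 2."""
    x = y - 1
    return sum((-1) ** n * x ** (n + 1) / (n + 1) for n in range(terms))

# Well inside the interval of convergence the partial sums home in quickly
assert abs(ln_series(1.5) - math.log(1.5)) < 1e-10
```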
Asperity Posted June 24, 2016

Any guides or advice on study methodology? This can include anything ranging from how you personally study, proven study methods that show observable and successful results, time spent, materials used, any ideology to force yourself to actually study, and so on. I have severe memory problems and I'd like to spend some time during the next month and a half-ish refreshing myself on material before starting graduate school.
Mestiv Posted June 29, 2016
Silverblade5 Posted July 1, 2016

Has anyone here ever tried creating a function that takes the number of rectangles in a Riemann sum as an input and gives the resulting area as an output? Is there a name for this?
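Such a function is just the Riemann sum itself, viewed as a function of n; as n grows, its value approaches the definite integral. A minimal Python sketch (names are mine):

```python
def riemann_sum(f, a, b, n):
    """Left-endpoint Riemann sum of f over [a, b] using n rectangles.
    As n -> infinity this approaches the definite integral of f on [a, b]."""
    dx = (b - a) / n
    return sum(f(a + i * dx) * dx for i in range(n))

# Area under x^2 on [0, 1]; the exact integral is 1/3, and the
# left-endpoint sum undershoots it by roughly 1/(2n)
approx = riemann_sum(lambda x: x * x, 0, 1, 1000)
```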