Taylor Series Derivative Approximations
This is what I am trying to achieve:
Write a MATLAB algorithm that evaluates
y′(t) = f(y,t)
y(a) = y0
using a first-order Taylor series to approximate the derivative. Test your code by setting
f(y,t) = y − t^2 + 1, with y(0) = 0.5,
and comparing your results to the analytical solution. For your records, in 150 words or less as
comments in the code, report the key steps in the code and the step size
used to achieve a reasonably accurate answer.
This is what I have:
syms y t;                            % symbolic variables for the test problem
a  = input('central point: ');       % expansion point for the Taylor series
f  = input('f(y,t) = ');             % right-hand side, e.g. y - t^2 + 1
v1 = input('variable 1: ');          % first expansion variable, e.g. y
v2 = input('variable 2: ');          % second expansion variable, e.g. t
taylor(f, [v1, v2], a, 'Order', 2)   % first-order (linear) Taylor polynomial
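For example, entering 0 (or [0 0]) for the central point, y - t^2 + 1 for the function, and y and t for the variables should return 1 + y, the linear Taylor polynomial of the test function about (0,0); the t-term vanishes because ∂f/∂t = -2t is zero there.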
So I've written a script that returns the first-order Taylor series of an input function. I don't know how to use this to approximate the derivative. Any direction would be helpful.
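The missing link is that a first-order Taylor expansion of y about t gives y(t+h) ≈ y(t) + h·y'(t) = y(t) + h·f(y,t), which is exactly one step of the forward Euler method, so the symbolic taylor call is not needed. Below is a minimal sketch under assumed choices: the interval [0,2] and the step size h = 0.01 are illustrative, not prescribed by the assignment, and yexact is the analytical solution of the test problem.

% Forward Euler via first-order Taylor: y(t+h) ~ y(t) + h*f(y(t),t)
f      = @(y,t) y - t.^2 + 1;           % test right-hand side
yexact = @(t) (t + 1).^2 - 0.5*exp(t);  % analytical solution with y(0) = 0.5
a = 0; b = 2;                           % assumed integration interval
h = 0.01;                               % assumed step size
t = a:h:b;
y = zeros(size(t));
y(1) = 0.5;                             % initial condition
for k = 1:numel(t) - 1
    y(k+1) = y(k) + h*f(y(k), t(k));    % one first-order Taylor (Euler) step
end
max(abs(y - yexact(t)))                 % worst-case error vs. exact solution

Since Euler's method is first-order accurate, halving h should roughly halve the error, which gives a simple way to pick a step size that yields a "reasonably accurate" answer.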