In this paper, continuous-time optimization methods are studied, and a novel gradient-flow scheme is designed that converges to the optimal point of a convex objective function in fixed time from any given initial point. It is shown that solutions of the modified gradient flow exist and are unique under certain regularity conditions on the objective function, and Lyapunov-based analysis is used to establish fixed-time convergence. The unconstrained optimization problem is considered under two different assumptions, namely strong convexity and gradient dominance, and fixed-time convergence is proven in both cases. Next, a modified Newton's method is presented that exhibits fixed-time convergence under mild conditions on the objective function. Then, a method for solving convex optimization problems with linear equality constraints is proposed that converges to the optimal point in fixed time. The constrained optimization problems are formulated as min-max problems, and a novel method of computing the optimal solution via the Lagrangian dual is proposed. Finally, the general min-max problem is considered, and a modified saddle-point dynamics is proposed so that the optimal solution is obtained in fixed time. Numerical illustrations are included to corroborate the efficacy of the proposed methods.
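For intuition, a minimal sketch of one representative fixed-time gradient flow is given below; the exponents, gains, and Lyapunov bound shown here are generic assumptions for illustration and are not necessarily the exact scheme analyzed in the paper. For a differentiable convex $f$ with minimizer $x^*$, gains $c_1, c_2 > 0$, and exponents $0 < \alpha < 1 < \beta$, consider
\[
\dot{x} \;=\; -\,c_1\,\frac{\nabla f(x)}{\|\nabla f(x)\|^{1-\alpha}} \;-\; c_2\,\frac{\nabla f(x)}{\|\nabla f(x)\|^{1-\beta}} .
\]
If $f$ is $\mu$-strongly convex, then with $V(x) = f(x) - f(x^*)$ one obtains $\dot{V} = -c_1\|\nabla f(x)\|^{1+\alpha} - c_2\|\nabla f(x)\|^{1+\beta} \le -a\,V^{(1+\alpha)/2} - b\,V^{(1+\beta)/2}$ for suitable $a, b > 0$ (using the strong-convexity inequality $\|\nabla f(x)\|^2 \ge 2\mu V(x)$), and the standard fixed-time stability lemma then bounds the settling time, uniformly in the initial condition $x(0)$, by
\[
T \;\le\; \frac{2}{a(1-\alpha)} \;+\; \frac{2}{b(\beta-1)} .
\]
The sublinear term (exponent $\alpha < 1$) drives finite-time convergence near $x^*$, while the superlinear term (exponent $\beta > 1$) makes the time to reach any neighborhood of $x^*$ bounded regardless of how far away the flow starts.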