This research focuses on the use of Light Detection and Ranging (LiDAR) sensors for robot localization. One of the most essential algorithms in LiDAR-based localization is the breakpoint detector, which is used to determine the corners of a room. Previously developed breakpoint detection methods have weaknesses. The Adaptive Breakpoint Detector (ABD), for example, can generate dynamic threshold values, but its results still require Line Extraction to obtain the corner breakpoints. A Line Extraction method such as the Iterative End Point Fit (IEPF) is used to categorize the data, producing line patterns that are interpreted as walls. The computation required to obtain the corner breakpoints therefore becomes longer because of the line extraction step. To address this issue, we propose a new threshold area in the form of an ellipse, with threshold value parameters obtained from the previously identified room size and the sensor characteristics. As a result, corner breakpoint detection becomes more adaptive. The goal of this research is to develop an Adaptive Line Tracking Breakpoint Detector (ALTBD) approach that reduces the computation time required to detect corner breakpoints. Furthermore, the Line Extraction step required for corner breakpoint detection is modified in the ALTBD, and the boundary value is increased to distinguish between the edge of a wall and the corner of a room. The ALTBD method was tested in a simulation arena composed of multiple rooms and halls. According to the results, the ALTBD detects corner breakpoints with less computation time than the ABD-IEPF method, and the accuracy of the robot position estimate is also improved.
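To make the ellipse-based threshold idea concrete, the sketch below shows a minimal breakpoint test over an ordered LiDAR scan. It is an illustrative assumption, not the exact ALTBD formulation: the function name, the parameters `a_axis` and `b_axis`, and the choice of aligning the ellipse with the previous beam direction are placeholders, whereas in the proposed method the ellipse parameters are derived from the identified room size and the sensor characteristics.

```python
import math

def elliptical_breakpoint_detector(scan, a_axis, b_axis):
    """Flag breakpoints in a LiDAR scan using an elliptical threshold region.

    scan   : list of (range_m, angle_rad) tuples, ordered by angle
    a_axis : semi-major axis of the threshold ellipse (illustrative; e.g.
             chosen from the known room size)
    b_axis : semi-minor axis (illustrative; e.g. chosen from the sensor's
             range noise and angular resolution)
    Returns the indices of points that fall outside the ellipse centred
    on the preceding point, i.e. candidate breakpoints.
    """
    breakpoints = []
    for i in range(1, len(scan)):
        r0, th0 = scan[i - 1]
        r1, th1 = scan[i]
        # Convert both measurements to Cartesian coordinates.
        x0, y0 = r0 * math.cos(th0), r0 * math.sin(th0)
        x1, y1 = r1 * math.cos(th1), r1 * math.sin(th1)
        # Express the displacement in a frame aligned with the previous
        # beam direction so the ellipse axes follow the scan geometry.
        dx, dy = x1 - x0, y1 - y0
        along = dx * math.cos(th0) + dy * math.sin(th0)
        across = -dx * math.sin(th0) + dy * math.cos(th0)
        # Outside the elliptical threshold region -> breakpoint candidate.
        if (along / a_axis) ** 2 + (across / b_axis) ** 2 > 1.0:
            breakpoints.append(i)
    return breakpoints
```

The design choice the ellipse reflects is that the acceptable gap between consecutive points is not isotropic: along a tracked wall the point spacing grows with range and incidence angle, while perpendicular jumps are much more likely to indicate a corner or an opening, so the two directions get separate threshold axes instead of the single radius used by a circular (ABD-style) threshold.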