Please use this identifier to cite or link to this item: https://doi.org/10.3722/cadaps.2011.301-313
Title: A neural network based approach to 5-axis tool-path length estimation for optimal multi-cutter selection
Authors: Geng, L.
Zhang, Y.F.
Fuh, J.Y.H.
Keywords: Five-axis machining
Multi-cutter selection
Neural network
Tool-path length
Issue Date: 2011
Citation: Geng, L., Zhang, Y.F., Fuh, J.Y.H. (2011). A neural network based approach to 5-axis tool-path length estimation for optimal multi-cutter selection. Computer-Aided Design and Applications 8 (2) : 301-313. ScholarBank@NUS Repository. https://doi.org/10.3722/cadaps.2011.301-313
Abstract: Compared to single-cutter machining, using multiple cutters in 5-axis finish machining of freeform surfaces can produce shorter tool-paths and hence higher machining efficiency. In our previous work, a method to evaluate a cutter's accessibility at any point on a machining surface was developed. In this paper, this method is used to identify feasible cutters and construct their machining regions. These cutters can form many cutter combinations that can finish the entire machining surface, among which there is an optimal set that produces the shortest tool-path. To find this optimal combination, we propose using a neural network to predict the tool-path length for a machining region without actually generating the tool-path. The neural network is trained extensively with a large set of carefully designed training data extracted from actual machining jobs. Finally, the validity of our method is demonstrated with testing data sets that were never exposed to the neural network before. © 2011 CAD Solutions, LLC.
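The sketch below illustrates the kind of regression model the abstract describes: a small neural network that maps machining-region features to an estimated tool-path length, so candidate cutter combinations can be compared without generating each tool-path. The feature set, network size, and data here are hypothetical stand-ins, not the authors' actual implementation.

```python
# Minimal sketch (assumed features and synthetic data) of predicting
# tool-path length from machining-region descriptors with a small network.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)

# Hypothetical features per machining region:
#   [region area (mm^2), mean curvature (1/mm), cutter diameter (mm)]
X = rng.uniform([100.0, 0.01, 4.0], [5000.0, 0.5, 20.0], size=(500, 3))

# Synthetic target: tool-path length grows with area and curvature and
# shrinks with cutter size (stand-in for lengths measured from real jobs).
y = 1.2 * X[:, 0] * (1.0 + X[:, 1]) / X[:, 2] + rng.normal(0.0, 20.0, 500)

# Small feed-forward network with input scaling, trained on the samples.
model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=5000, random_state=0),
)
model.fit(X, y)

# Estimate tool-path lengths for two candidate (region, cutter) pairings;
# the combination with the shorter predicted total length would be chosen.
candidates = np.array([
    [2500.0, 0.20, 10.0],   # region machined with a 10 mm cutter
    [2500.0, 0.20, 16.0],   # same region with a 16 mm cutter
])
print("Predicted tool-path lengths (mm):", model.predict(candidates).round(1))
```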
Source Title: Computer-Aided Design and Applications
URI: http://scholarbank.nus.edu.sg/handle/10635/51301
ISSN: 1686-4360
DOI: 10.3722/cadaps.2011.301-313
Appears in Collections:Staff Publications
