Just curious to know if there is a list of steps I can use as guidelines to debug performance issues and pinpoint what is taking the most time. There are a myriad of tools, starting with logging, timing methods, load-test tools, timing database queries and so on.
Considering that there are so many different things, is there a list of things that belong at the top?
If so, please let me know.
- Check the machine physically has enough RAM. Paging will kill applications.
- Check what proportion of the application's time is spent in garbage collection. A high proportion means you'll benefit from heap or GC tuning (see the sketch after this list).
- Run the app in a profiler and put it through its paces, monitoring CPU usage. Look for those methods where the CPU spends all its time. A good profiler will let you roll up the time spent in third party code where you have no control, allowing you to identify the hot spots in your own code.
- For the top hot spots in your application, work out where the time is being spent. Is it all I/O? Calculations? Multi-thread synchronisation? Object allocation/deallocation? A poor algorithm with n-squared or worse complexity? Something else?
- Deal with the hot spots one at a time, and each time you change something, measure and work out whether you've actually improved anything. Roll back the ineffective changes; when a change does work, work out where the bottleneck has moved to next.
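A quick way to sanity-check the GC proportion mentioned above, without reaching for a full profiler, is the standard GarbageCollectorMXBean API. This is only a minimal sketch (the class name and output format are mine, not part of the answer); call something like it periodically from inside the application while it is under load:

```java
import java.lang.management.GarbageCollectorMXBean;
import java.lang.management.ManagementFactory;
import java.lang.management.RuntimeMXBean;

public class GcTimeCheck {
    public static void main(String[] args) {
        RuntimeMXBean runtime = ManagementFactory.getRuntimeMXBean();
        long uptimeMs = runtime.getUptime();

        long totalGcMs = 0;
        for (GarbageCollectorMXBean gc : ManagementFactory.getGarbageCollectorMXBeans()) {
            // Collection count/time are accumulated since JVM start; time is in milliseconds.
            totalGcMs += Math.max(gc.getCollectionTime(), 0);
            System.out.printf("%s: %d collections, %d ms%n",
                    gc.getName(), gc.getCollectionCount(), gc.getCollectionTime());
        }

        double gcShare = uptimeMs > 0 ? (100.0 * totalGcMs) / uptimeMs : 0.0;
        System.out.printf("GC time: %d ms of %d ms uptime (%.1f%%)%n",
                totalGcMs, uptimeMs, gcShare);
    }
}
```

As a rough rule of thumb, if GC accounts for more than a few percent of uptime under a steady workload, the heap/GC tuning suggested above is likely to help.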
There is nothing really specific to Java about something like this, with any language/framework/tool you should follow the same pattern:
- Measure the performance before you change a single thing
- Hypothesize about possible causes/fixes
- Implement the change
- Measure performance after the change and compare with step 1 (a minimal timing harness is sketched after this list)
- Repeat until happy
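For the measure-before/measure-after steps, even a crude wall-clock harness beats guessing, as long as you warm the JVM up and take several samples. A minimal sketch follows; the workload lambda is a placeholder for whatever code path you are tuning, and for serious micro-benchmarks a dedicated tool such as JMH is more trustworthy:

```java
import java.util.Arrays;
import java.util.concurrent.TimeUnit;

public class SimpleBenchmark {

    /** Runs the workload several times and reports the median wall-clock time. */
    static long medianMillis(Runnable workload, int warmups, int runs) {
        for (int i = 0; i < warmups; i++) {
            workload.run();                       // warm up JIT and caches
        }
        long[] samples = new long[runs];
        for (int i = 0; i < runs; i++) {
            long start = System.nanoTime();
            workload.run();
            samples[i] = System.nanoTime() - start;
        }
        Arrays.sort(samples);
        return TimeUnit.NANOSECONDS.toMillis(samples[runs / 2]);
    }

    public static void main(String[] args) {
        // Placeholder workload: replace with the code path you are tuning.
        Runnable workload = () -> {
            StringBuilder sb = new StringBuilder();
            for (int i = 0; i < 100_000; i++) sb.append(i);
        };
        System.out.println("median ms: " + medianMillis(workload, 5, 20));
    }
}
```

Record the number you get before the change, make one change, run the same harness again, and only then decide whether to keep it.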
- Measure
- MEASURE!!!!!
- Compare apples to apples. Don't run your tests on a busy subnet (and especially don't try to justify this ludicrous practice by saying "I want the circumstances to be realistic")
- Measure: capture time stamps at each discrete step (see the sketch at the end of this answer)
- Note that although there is a relationship, throughput and response time are not the same thing
- After you make a change... MEASURE!!!!! Never say to yourself, "it seems better." How do you know it's better? Compare measurement 1 to measurement 2
- Test one thing at a time. Don't create one uber performance suite that attempts to simulate realistic conditions. It's too much, and you are setting yourself up to be overwhelmed. Test for message size. Test for concurrency. Test in isolation
Once you start to isolate the bottlenecks, the next steps will start to feel more natural and fine-tuning your tests will become easier; at that point you may choose to hook up a profiler to investigate GC/CPU performance and memory consumption (VisualVM is good and free). The point is to treat performance issues like a binary search: start by measuring everything and continually subdivide the problem until it reveals itself.
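To capture time stamps at each discrete step, something as simple as the following is often enough to show which phase dominates. This is a sketch only; the class name and step names are purely illustrative:

```java
import java.util.ArrayList;
import java.util.List;

/** Records a timestamp at each named step so the slowest phase stands out. */
public class StepTimer {
    private final List<String> labels = new ArrayList<>();
    private final List<Long> marks = new ArrayList<>();

    public void mark(String label) {
        labels.add(label);
        marks.add(System.nanoTime());
    }

    public void report() {
        for (int i = 1; i < marks.size(); i++) {
            long elapsedMs = (marks.get(i) - marks.get(i - 1)) / 1_000_000;
            System.out.printf("%-20s %6d ms%n", labels.get(i), elapsedMs);
        }
    }

    public static void main(String[] args) throws InterruptedException {
        StepTimer timer = new StepTimer();
        timer.mark("start");
        Thread.sleep(50);        // stand-in for parsing the request
        timer.mark("parse request");
        Thread.sleep(120);       // stand-in for the database query
        timer.mark("query database");
        Thread.sleep(30);        // stand-in for rendering the response
        timer.mark("render response");
        timer.report();
    }
}
```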
The first and most important step in any kind of performance tuning is to identify what is slow and measure just how slow it is. In most cases (particularly if the performance problem is easy to reproduce), a profiler is the most effective tool for that, as it gives you detailed statistics on execution time, broken down to single methods, without having to manually instrument your program.
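If attaching a GUI profiler to the process isn't practical, Java Flight Recorder (bundled with OpenJDK 11 and later) gives similar method-level and allocation statistics with low overhead, and the resulting file can be opened in JDK Mission Control. Below is a sketch of starting a recording programmatically, assuming JDK 11+; the configuration name, duration of the workload, and file name are placeholders:

```java
import jdk.jfr.Configuration;
import jdk.jfr.Recording;
import java.nio.file.Path;

public class ProfileRun {
    public static void main(String[] args) throws Exception {
        // "profile" is one of the two predefined JFR configurations ("default" is cheaper).
        Configuration config = Configuration.getConfiguration("profile");
        try (Recording recording = new Recording(config)) {
            recording.start();

            runWorkload();                               // the code you want to profile

            recording.stop();
            recording.dump(Path.of("profile.jfr"));      // open this file in JDK Mission Control
        }
    }

    private static void runWorkload() {
        // Placeholder: exercise the slow path you are investigating here.
        for (int i = 0; i < 1_000_000; i++) {
            String.valueOf(Math.sqrt(i));
        }
    }
}
```

The same kind of recording can also be started without touching the code via the -XX:StartFlightRecording JVM option.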
Check DB queries
Check for statements created inside loops; building a new statement on every iteration slows the application down. Use a PreparedStatement/CallableStatement instead (see the sketch after this list)
Capture time stamps at each discrete step
Identify the hot spots where time is being spent, like I/O, calculations, multithreaded synchronization and garbage collection, and look out for poor algorithms
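On the statements-in-loops point, the win comes from preparing the SQL once, binding parameters per iteration, and batching the executions where the driver supports it. A minimal JDBC sketch; the table and column names are invented for illustration:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.List;

public class OrderDao {

    /** Inserts all rows with one PreparedStatement, reusing the parsed query and batching the writes. */
    static void insertOrders(Connection conn, List<int[]> orders) throws SQLException {
        String sql = "INSERT INTO orders (customer_id, amount) VALUES (?, ?)";
        try (PreparedStatement ps = conn.prepareStatement(sql)) {
            for (int[] order : orders) {
                ps.setInt(1, order[0]);
                ps.setInt(2, order[1]);
                ps.addBatch();                 // queue the row instead of one round trip per row
            }
            ps.executeBatch();                 // single batched execution
        }
        // The slow alternative is creating a Statement and concatenating SQL strings
        // inside the loop, which forces a parse (and usually a network round trip)
        // for every single row.
    }
}
```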