Could someone prove to me that the advice given here (copied below) about removing DOM elements before altering them and then re-inserting them is ever actually quicker?
By prove, I mean I would like to see some figures. It's great that they researched this, but I think the article is very weak without specifics about what the 'problem' actually is and how the solution fixes it in terms of speed (given the article's title, Speeding up JavaScript).
The article....
Out-of-the-flow DOM Manipulation
This pattern lets us create multiple elements and insert them into the DOM while triggering only a single reflow. It uses something called a DocumentFragment. We create a DocumentFragment outside of the DOM (so it is out-of-the-flow), then create and add multiple elements to it. Finally, we move all the elements in the DocumentFragment into the DOM, triggering only a single reflow.
The problem
Let's make a function that changes the className attribute for all anchors within an element. We could do this by simply iterating through each anchor and updating its className attribute. The problem is, this can cause a reflow for each anchor.
function updateAllAnchors(element, anchorClass) {
    var anchors = element.getElementsByTagName('a');
    for (var i = 0, length = anchors.length; i < length; i++) {
        anchors[i].className = anchorClass;
    }
}
The solution
To solve this problem, we can remove the element from the DOM, update all anchors, and then insert the element back where it was. To help achieve this, we can write a reusable function that not only removes an element from the DOM, but also returns a function that will insert the element back into its original position.
/**
 * Remove an element and provide a function that inserts it into its original position
 * @param element {Element} The element to be temporarily removed
 * @return {Function} A function that inserts the element into its original position
 **/
function removeToInsertLater(element) {
    var parentNode = element.parentNode;
    var nextSibling = element.nextSibling;
    parentNode.removeChild(element);
    return function() {
        if (nextSibling) {
            parentNode.insertBefore(element, nextSibling);
        } else {
            parentNode.appendChild(element);
        }
    };
}
Now we can use this function to update the anchors within an element that is out-of-the-flow, and only trigger a reflow when we remove the element and when we insert the element.
function updateAllAnchors(element, anchorClass) {
    var insertFunction = removeToInsertLater(element);
    var anchors = element.getElementsByTagName('a');
    for (var i = 0, length = anchors.length; i < length; i++) {
        anchors[i].className = anchorClass;
    }
    insertFunction();
}
You will find it hard to get meaningful figures for this from JavaScript profiling, because what you are really saving is repaints and reflows, which won't show up in most profiling tools. You can use the Firebug Paint Events extension to show you visually how many repaints you're saving.
I put some links on a page and tested the method in the article compared to setting the class name with the elements still in the page. I tried this in Firefox 3, IE 8 and Chrome 3.
I made classes for the links with different colors and different font sizes. Since the link text had a different size for each class, I was sure that the page really had to be reflowed.
For any reasonable number of links (up to a few thousand), removing and adding the elements is slightly slower.
For an extremely large number of links (10 000), removing and adding the elements is slightly faster.
However, the difference is quite small. You have to have several thousand links to be able to notice any difference at all, and at 10 000 links there is still only something like a 20% difference.
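A minimal sketch of the kind of comparison described above (assuming a browser environment; Date.now() timings are coarse, and real figures depend heavily on the browser and the page). The removeToInsertLater helper is the article's function, reproduced here so the sketch is self-contained; timeUpdate is an illustrative name.

```javascript
// Article's helper: remove an element, return a function that reinserts it.
function removeToInsertLater(element) {
    var parentNode = element.parentNode;
    var nextSibling = element.nextSibling;
    parentNode.removeChild(element);
    return function() {
        if (nextSibling) {
            parentNode.insertBefore(element, nextSibling);
        } else {
            parentNode.appendChild(element);
        }
    };
}

// Time updating every anchor's className, either with the container left
// in the page (removeFirst = false) or temporarily removed (true).
function timeUpdate(container, anchorClass, removeFirst) {
    var start = Date.now();
    var insertBack = removeFirst ? removeToInsertLater(container) : null;
    var anchors = container.getElementsByTagName('a');
    for (var i = 0, length = anchors.length; i < length; i++) {
        anchors[i].className = anchorClass;
    }
    if (insertBack) {
        insertBack();
    }
    return Date.now() - start;
}
```

Running both variants on a page with a few thousand links, per the results above, you should expect only a small difference, and only at very large link counts.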
So, what I have found is that you can't expect any dramatic change from this method. If you have performance problems, there are probably other methods that give much better results. For example, I would try changing the class name of the parent element instead of all the child elements, and let CSS do the work. Tests I have done before showed that this can be about ten times faster.
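The parent-class approach suggested above can be sketched as follows (the class name and function name are illustrative, not from the article): define one descendant CSS rule, then make a single DOM write on the container instead of one write per anchor.

```javascript
// CSS (illustrative):
//   .highlight-links a { color: #c00; font-size: 14px; }
//
// One class change on the container restyles every descendant anchor in a
// single style recalculation, instead of N separate className writes.
function updateAllAnchorsViaParent(element, parentClass) {
    if (element.className.indexOf(parentClass) === -1) {
        // Append the class, trimming any leading space for an empty className.
        element.className = (element.className + ' ' + parentClass).replace(/^\s+/, '');
    }
}
```

On modern browsers element.classList.add(parentClass) does the same thing more cleanly, but the string-based version matches the era of the browsers tested above.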
This is more or less the same as using DocumentFragments to initialize elements rather than the DOM. Document fragments end up being faster because they have far less structure and no actual rendering to worry about.
Here are some of John Resig's notes on the performance benefits of document fragments (which jQuery currently uses):
http://ejohn.org/blog/dom-documentfragments/
The short answer is that changes to the DOM of the actual page trigger JavaScript events, CSS evaluations, propagation of the changes' effects through the rest of the DOM around them, and so on. Disconnected nodes in flux have none of that attached to them, so manipulations on them are much cheaper.
An analogy: If you were an animator working on Toy Story 4, and in a near-final render you saw a change you needed to make in the fabric physics of a scene, would you make your changes while doing full-detail re-renders to inspect the scene, or turn off textures and colors and lower the resolution while you make those changes?
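The fragment pattern those notes describe can be sketched like this (assuming a browser environment; appendListItems is an illustrative name): nodes are built off-DOM on a DocumentFragment, then inserted in one step.

```javascript
// Build list items off-DOM in a DocumentFragment, then append the fragment
// once. Appending a fragment moves its children into the target node, so
// the page sees a single insertion rather than one per item.
function appendListItems(listElement, labels) {
    var fragment = document.createDocumentFragment();
    for (var i = 0; i < labels.length; i++) {
        var item = document.createElement('li');
        item.appendChild(document.createTextNode(labels[i]));
        fragment.appendChild(item); // off-DOM: no reflow yet
    }
    listElement.appendChild(fragment); // one insertion into the live page
}
```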
There is no reliable way to tell when a reflow has finished, other than looking at the page.
The time spent calculating is virtually the same whether you add or change elements all at once or one by one. The script doesn't wait for the browser between elements; the browser catches up.
Sometimes you want to see something right away, even if it takes longer to render the whole thing; other times you want to wait a little longer and show it all when it is ready.
If you can't tell which is better, get a designer to look at the one-by-one and all-at-once versions, but don't ask two designers!