Some clever-looking JavaScript patterns secretly hurt performance. Here are 15 common "smart" tricks that actually slow your code down, and the better alternatives you should use.
Introduction
We've all done it.
You discover a new JavaScript trick. It looks clever. It feels elegant. You refactor half your codebase to use it.
Then performance drops.
JavaScript performance problems rarely come from obvious mistakes. They come from well-intentioned patterns used in the wrong context.
In this article, I'll break down 15 JavaScript tricks that can make your code slower, especially in real-world apps (React, Node.js, browser-heavy UIs). More importantly, I'll show what to do instead.
Let's save your CPU from unnecessary pain.
1. Overusing map(), filter(), and reduce() in Chains
The Fancy Way
const result = users
  .filter(user => user.active)
  .map(user => user.email)
  .reduce((acc, email) => [...acc, email], []);

Looks clean. But this:
- Iterates the array multiple times
- Creates intermediate arrays
- Adds memory pressure
Faster Alternative (Single Loop)
const result = [];
for (const user of users) {
  if (user.active) {
    result.push(user.email);
  }
}

When performance matters (large datasets), fewer loops win.
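If you'd rather keep reduce, mutating the accumulator instead of spreading it avoids the intermediate copies. A minimal sketch, with a hypothetical users array:

```javascript
// One pass, one output array: push into the accumulator instead of
// rebuilding it with spread on every step.
const users = [
  { active: true, email: "a@example.com" },
  { active: false, email: "b@example.com" },
  { active: true, email: "c@example.com" },
];

const result = users.reduce((acc, user) => {
  if (user.active) acc.push(user.email); // mutate, don't copy
  return acc;
}, []);
// result: ["a@example.com", "c@example.com"]
```

Same single-pass behavior as the for...of version, just expressed with reduce.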
2. Using Spread in Tight Loops
Problem
let arr = [];
for (let i = 0; i < 10000; i++) {
  arr = [...arr, i];
}

This creates a new array and copies every existing element on each iteration, making the loop effectively O(n²).
Better
const arr = [];
for (let i = 0; i < 10000; i++) {
  arr.push(i);
}

Spread is expressive but expensive inside loops.
3. Re-Creating Functions Inside Loops
Problem
items.forEach(item => {
  const process = () => console.log(item);
  process();
});

You're creating a new function every iteration.
Better
function process(item) {
  console.log(item);
}

items.forEach(process);

Function creation has a cost, especially in hot paths.
4. JSON Deep Cloning
Classic Trick
const cloned = JSON.parse(JSON.stringify(obj));

Problems:
- Slow for large objects
- Breaks Dates, Maps, undefined, functions
- Blocks the main thread
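The breakage is easy to see in a quick sketch (the values below are illustrative):

```javascript
// JSON round-tripping silently changes or drops non-JSON values.
const original = {
  when: new Date(0),          // becomes an ISO string
  tags: new Map([["a", 1]]),  // becomes an empty plain object
  note: undefined,            // the key disappears entirely
};

const cloned = JSON.parse(JSON.stringify(original));

console.log(typeof cloned.when); // "string", no longer a Date
console.log(cloned.tags);        // {}
console.log("note" in cloned);   // false
```

No error is thrown, which is what makes this trick so easy to ship by accident.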
Better
Use structuredClone() (modern environments):
const cloned = structuredClone(obj);

Or better yet: rethink whether you need deep cloning at all.
5. Using forEach() When You Need for
forEach() is clean, but:
- Cannot break early
- Slightly slower than classic loops
- Less flexible
Problem
arr.forEach(item => {
  if (item === target) return; // only skips this iteration; the loop continues
});

Better
for (let i = 0; i < arr.length; i++) {
  if (arr[i] === target) break;
}

In performance-sensitive code, traditional loops often win.
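If index arithmetic feels noisy, for...of also supports break and stays close to forEach's readability. A sketch with placeholder arr and target values:

```javascript
const arr = [3, 7, 42, 9];
const target = 42;

let found = false;
for (const item of arr) {
  if (item === target) {
    found = true;
    break; // exits the loop immediately, which forEach cannot do
  }
}
// found: true, and the final element 9 was never visited
```

For a pure existence check, arr.includes(target) does the same thing in one call.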
6. Overusing Optional Chaining in Hot Paths
Optional chaining is amazing:
user?.profile?.settings?.theme

But in deeply nested loops, repeated checks add overhead.
Better Strategy:
Cache references when used repeatedly:
const profile = user?.profile;
if (profile) {
  const settings = profile.settings;
}

Micro-optimizations matter in tight loops.
7. Heavy Object Destructuring in Large Iterations
for (const { name, age, email } of users) {
  console.log(name);
}

Destructuring creates fresh bindings every iteration.
for (let i = 0; i < users.length; i++) {
  console.log(users[i].name);
}

Readable? Yes. Fastest? Not always.
8. Using setTimeout(fn, 0) for Async Flow
Old trick:
setTimeout(() => {
  heavyTask();
}, 0);

This still queues a macrotask, which runs only after all pending microtasks and may be further delayed by timer clamping.
Better
Use microtasks:
Promise.resolve().then(heavyTask);

Or better: use async/await cleanly.
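The ordering difference is easy to observe. A sketch that runs in Node or a browser:

```javascript
// Microtasks run before any macrotask queued at the same time.
const order = [];

setTimeout(() => order.push("macrotask"), 0);          // timer (macrotask) queue
Promise.resolve().then(() => order.push("microtask")); // microtask queue
order.push("sync");

// After the event loop turns: order is ["sync", "microtask", "macrotask"]
```

Note that neither option makes heavyTask non-blocking; it just changes when the blocking happens.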
9. Excessive DOM Queries
for (let i = 0; i < 1000; i++) {
  document.querySelector("#app").innerHTML += i;
}

Each querySelector call hits the DOM again.
Cache It
const app = document.querySelector("#app");
for (let i = 0; i < 1000; i++) {
  app.innerHTML += i;
}

DOM access is expensive. Cache references. Better still, build the string first and assign it once: innerHTML += re-serializes and re-parses the element's entire content on every iteration.
10. Frequent Re-Renders in UI Frameworks
In React (for example), inline handlers defeat memoization:

<button onClick={() => handleClick(id)}>Click</button>

Every render creates a new function, so memoized children receive a new prop and re-render anyway.
Better
const handleClick = useCallback(() => {
  ...
}, [id]);

<button onClick={handleClick}>Click</button>

The stable reference prevents unnecessary re-renders of memoized children.
11. Using delete on Objects
delete obj.property;

This can de-optimize objects internally: V8, for example, may drop the object's fast hidden class.
Better
obj.property = undefined;

Or create a new object without the property. Note that assigning undefined keeps the key (it still appears in Object.keys and in checks), so the two are not equivalent.
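A short sketch showing the observable difference between the two approaches:

```javascript
const a = { x: 1, y: 2 };
const b = { x: 1, y: 2 };

delete a.y;      // key is removed entirely
b.y = undefined; // key remains, value is undefined

console.log(Object.keys(a)); // ["x"]
console.log(Object.keys(b)); // ["x", "y"]
console.log("y" in a);       // false
console.log("y" in b);       // true
```

Pick based on semantics first: if code iterates the object's keys, assigning undefined is not a drop-in replacement for delete.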
12. Massive Try/Catch Blocks
Try/catch itself is cheap in modern engines; actually throwing and catching exceptions frequently is what costs.
try {
  riskyOperation();
} catch (e) {} // silently swallows the error

When this throws inside a loop, it becomes expensive.
Validate First
if (isValid(data)) {
  riskyOperation();
}

Avoid exceptions as control flow.
13. Recomputing Expensive Values
for (let i = 0; i < items.length; i++) {
  const result = heavyCalculation(data);
}

Memoize
const memo = heavyCalculation(data);
for (let i = 0; i < items.length; i++) {
  use(memo);
}

Cache expensive computations.
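For functions called repeatedly with the same arguments, a small cache wrapper generalizes this. A sketch: the memoize helper and the sample square function are hypothetical, keyed here by JSON-stringified arguments.

```javascript
// Cache results per argument list so repeated calls skip the real work.
function memoize(fn) {
  const cache = new Map();
  return (...args) => {
    const key = JSON.stringify(args); // works for JSON-safe arguments only
    if (!cache.has(key)) {
      cache.set(key, fn(...args));
    }
    return cache.get(key);
  };
}

let calls = 0;
const square = memoize(n => {
  calls++; // count how often the underlying computation actually runs
  return n * n;
});

square(4);
square(4);
square(4);
// calls: 1 (the computation ran once; the cache served the rest)
```

The JSON key is a simplification; for non-serializable arguments you would need a different keying strategy.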
14. Using eval()
eval("2 + 2");

- Slow
- Unsafe
- Prevents engine optimizations
There is almost always a safer alternative.
15. Over-Abstracting Simple Logic
Abstraction is good; over-abstraction is not.
function executeStrategy(strategy) {
  return strategy.execute();
}

For trivial logic, this adds call overhead and cognitive cost.
Rule:
Optimize for clarity first. Abstract when necessary.
Why These Slowdowns Happen
JavaScript engines (like V8) optimize predictable patterns.
They struggle when you:
- Constantly recreate objects
- Change object shapes
- Create excessive garbage
- Trigger unnecessary re-renders
- Block the main thread
Performance isn't about cleverness.
It's about stability and predictability.
Key Takeaways
- Avoid unnecessary array chaining in large datasets
- Don't use spread inside loops
- Cache DOM queries
- Minimize object shape changes
- Avoid deep cloning unless required
- Memoize expensive computations
- Use loops wisely in performance-critical code
- Don't use exceptions for normal logic
Most slowdowns come from micro-inefficiencies repeated thousands of times.
Final Thoughts
JavaScript gives us expressive tools.
But not every clever trick is free.
The best developers don't just write elegant code; they write code that scales under pressure.
Before adopting a "smart" pattern, ask:
Is this cleaner or just fancier?
Your users care about speed. Your CPU does too.