-| ๐ฏ **Domain** | ๐ **Achievement** | ๐ **Example** |
+| **Domain** | **Achievement** | **Example** |
|---------------|-------------------|----------------|
| **GPU Optimization** | Hardware-optimized kernel discovery | [MLX Metal Kernels](examples/mlx_metal_kernel_opt/) |
| **Mathematical** | State-of-the-art circle packing (n=26) | [Circle Packing](examples/circle_packing/) |
@@ -127,12 +127,12 @@ result = evolve_function(
print(f"Evolved sorting algorithm: {result.best_code}")
```
-**Prefer Docker?** See the [Installation & Setup](#-installation--setup) section for Docker options.
+**Prefer Docker?** See the [Installation & Setup](#installation--setup) section for Docker options.
-## ๐ฌ See It In Action
+## See It In Action
-๐ฅ Circle Packing: From Random to State-of-the-Art
+Circle Packing: From Random to State-of-the-Art
**Watch OpenEvolve discover state-of-the-art circle packing in real time:**
@@ -146,7 +146,7 @@ print(f"Evolved sorting algorithm: {result.best_code}")
-โก GPU Kernel Evolution
+GPU Kernel Evolution
**Before (Baseline)**:
```metal
@@ -174,23 +174,23 @@ kernel void attention_evolved(/* ... */) {
-## ๐งฌ How OpenEvolve Works
+## How OpenEvolve Works
OpenEvolve implements an **evolutionary coding pipeline** that pairs LLM-driven code mutation with quality-diversity search:

-### ๐ฏ **Core Innovation**: MAP-Elites + LLMs
+### **Core Innovation**: MAP-Elites + LLMs
- **Quality-Diversity Evolution**: Maintains diverse populations across feature dimensions (see the sketch after this list)
- **Island-Based Architecture**: Multiple populations prevent premature convergence
- **LLM Ensemble**: Multiple models with intelligent fallback strategies
- **Artifact Side-Channel**: Error feedback improves subsequent generations
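
A minimal sketch of the MAP-Elites idea behind this (illustrative only; the class name and feature dimensions below are assumptions, not OpenEvolve's internal API): each candidate program is binned into a feature-space cell and replaces that cell's elite only if it scores higher.

```python
from typing import Callable, Dict, Tuple

Cell = Tuple[int, ...]

class MapElitesArchive:
    """Keep the best program per feature cell instead of a single global best."""

    def __init__(self, descriptor: Callable[[str], Cell]):
        self.descriptor = descriptor                    # maps code -> feature-space cell
        self.elites: Dict[Cell, Tuple[float, str]] = {}

    def add(self, code: str, score: float) -> bool:
        cell = self.descriptor(code)
        incumbent = self.elites.get(cell)
        if incumbent is None or score > incumbent[0]:
            self.elites[cell] = (score, code)           # candidate becomes the cell's elite
            return True
        return False

# Hypothetical descriptor: bin programs by length and loop count.
archive = MapElitesArchive(lambda code: (len(code) // 200, code.count("for")))
archive.add("for i in range(10): pass", 0.5)
```

Running several such archives as separate islands, with occasional migration between them, is what keeps populations from converging prematurely.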
-### ๐ **Advanced Features**
+### **Advanced Features**
-๐ฌ Scientific Reproducibility
+Scientific Reproducibility
- **Comprehensive Seeding**: Every component (LLM, database, evaluation) is seeded
- **Default Seed=42**: Immediate reproducible results out of the box
@@ -200,7 +200,7 @@ OpenEvolve implements a sophisticated **evolutionary coding pipeline** that goes
-๐ค Advanced LLM Integration
+Advanced LLM Integration
- **Universal API**: Works with OpenAI, Google, local models, and proxies
- **Intelligent Ensembles**: Weighted combinations with sophisticated fallback
@@ -210,7 +210,7 @@ OpenEvolve implements a sophisticated **evolutionary coding pipeline** that goes
-๐งฌ Evolution Algorithm Innovations
+Evolution Algorithm Innovations
- **Double Selection**: Different programs for performance vs inspiration
- **Adaptive Feature Dimensions**: Custom quality-diversity metrics
@@ -219,15 +219,15 @@ OpenEvolve implements a sophisticated **evolutionary coding pipeline** that goes
-## ๐ฏ Perfect For
+## Perfect For
| **Use Case** | **Why OpenEvolve Excels** |
|--------------|---------------------------|
-| ๐โโ๏ธ **Performance Optimization** | Discovers hardware-specific optimizations humans miss |
-| ๐งฎ **Algorithm Discovery** | Finds novel approaches to classic problems |
-| ๐ฌ **Scientific Computing** | Automates tedious manual tuning processes |
-| ๐ฎ **Competitive Programming** | Generates multiple solution strategies |
-| ๐ **Multi-Objective Problems** | Pareto-optimal solutions across dimensions |
+| **Performance Optimization** | Discovers hardware-specific optimizations humans miss |
+| **Algorithm Discovery** | Finds novel approaches to classic problems |
+| **Scientific Computing** | Automates tedious manual tuning processes |
+| **Competitive Programming** | Generates multiple solution strategies |
+| **Multi-Objective Problems** | Pareto-optimal solutions across dimensions |
## Installation & Setup
@@ -356,23 +356,23 @@ llm:
-## ๐ธ Examples Gallery
+## Examples Gallery
-### ๐ **Showcase Projects**
+### **Showcase Projects**
| Project | Domain | Achievement | Demo |
|---------|--------|-------------|------|
-| [๐ฏ **Function Minimization**](examples/function_minimization/) | Optimization | Random โ Simulated Annealing | [View Results](examples/function_minimization/openevolve_output/) |
-| [โก **MLX GPU Kernels**](examples/mlx_metal_kernel_opt/) | Hardware | Apple Silicon optimization | [Benchmarks](examples/mlx_metal_kernel_opt/README.md) |
-| [๐ **Rust Adaptive Sort**](examples/rust_adaptive_sort/) | Algorithms | Data-aware sorting | [Code Evolution](examples/rust_adaptive_sort/) |
-| [๐ **Symbolic Regression**](examples/symbolic_regression/) | Science | Automated equation discovery | [LLM-SRBench](examples/symbolic_regression/) |
-| [๐ธ๏ธ **Web Scraper + OptiLLM**](examples/web_scraper_optillm/) | AI Integration | Test-time compute optimization | [Smart Scraping](examples/web_scraper_optillm/) |
+| [**Function Minimization**](examples/function_minimization/) | Optimization | Random → Simulated Annealing | [View Results](examples/function_minimization/openevolve_output/) |
+| [**MLX GPU Kernels**](examples/mlx_metal_kernel_opt/) | Hardware | Apple Silicon optimization | [Benchmarks](examples/mlx_metal_kernel_opt/README.md) |
+| [**Rust Adaptive Sort**](examples/rust_adaptive_sort/) | Algorithms | Data-aware sorting | [Code Evolution](examples/rust_adaptive_sort/) |
+| [**Symbolic Regression**](examples/symbolic_regression/) | Science | Automated equation discovery | [LLM-SRBench](examples/symbolic_regression/) |
+| [**Web Scraper + OptiLLM**](examples/web_scraper_optillm/) | AI Integration | Test-time compute optimization | [Smart Scraping](examples/web_scraper_optillm/) |
-### ๐ฏ **Quick Example**: Function Minimization
+### **Quick Example**: Function Minimization
**Watch OpenEvolve evolve from random search to simulated annealing with adaptive cooling:**
@@ -388,7 +388,7 @@ def minimize_function(func, bounds, max_evals=1000):
return best_x, best_val
```
-**โ Evolution Process โ**
+**Evolution Process**
```python
# Evolved Program (Simulated Annealing + Adaptive Cooling)
@@ -413,20 +413,9 @@ def minimize_function(func, bounds, max_evals=1000):
### **Advanced Examples**
-๐จ Prompt Evolution
+Prompt Evolution
-**Evolve prompts instead of code** for better LLM performance:
-
-```yaml
-# Example: HotpotQA dataset
-Initial Prompt: "Answer the question based on the context."
-
-Evolved Prompt: "As an expert analyst, carefully examine the provided context.
-Break down complex multi-hop reasoning into clear steps. Cross-reference
-information from multiple sources to ensure accuracy. Answer: [question]"
-
-Result: +23% accuracy improvement on HotpotQA benchmark
-```
+**Evolve prompts instead of code** for better LLM performance. See the [LLM Prompt Optimization example](examples/llm_prompt_optimization/) for a complete HotpotQA case study that achieves a +23% accuracy improvement.
[Full Example](examples/llm_prompt_optimization/)
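
As a rough illustration of what is being evolved in that example (hypothetical helper and data, not the example's actual code): the candidate "program" is just a prompt string, and its fitness is accuracy on a small QA dev set.

```python
from typing import Callable, List, Tuple

# Hypothetical dev set; the real example evaluates against HotpotQA.
DEV_SET: List[Tuple[str, str]] = [("Who wrote Hamlet?", "William Shakespeare")]

def prompt_fitness(prompt: str, ask_llm: Callable[[str], str]) -> float:
    # Score a candidate prompt by how often the model's answer contains the reference.
    correct = sum(
        answer.lower() in ask_llm(f"{prompt}\n\nQuestion: {question}").lower()
        for question, answer in DEV_SET
    )
    return correct / len(DEV_SET)
```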
@@ -683,7 +672,7 @@ system_message: |
**Multi-Phase Evolution:** Start broad ("Explore different algorithmic approaches"), then focus ("Given successful simulated annealing, focus on parameter tuning")
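
A tiny sketch of one way to implement that schedule (assumed structure, not OpenEvolve's configuration schema): switch the system message from a broad prompt to a focused one partway through the run.

```python
# Phase prompts taken from the example messages above.
PHASES = {
    "explore": "Explore different algorithmic approaches.",
    "refine": "Given successful simulated annealing, focus on parameter tuning.",
}

def system_message(iteration: int, total_iterations: int) -> str:
    # Broad exploration for the first half of the run, focused tuning afterwards.
    return PHASES["explore"] if iteration < total_iterations // 2 else PHASES["refine"]
```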
-**Template Stochasticity:** See the [Configuration section](#-configuration) for complete template variation examples.
+**Template Stochasticity:** See the [Configuration section](#configuration) for complete template variation examples.
@@ -892,8 +881,8 @@ If you use OpenEvolve in your research, please cite:
### **Ready to evolve your code?**
-**Made with โค๏ธ by the OpenEvolve community**
+**Maintained by the OpenEvolve community**
-*Star โญ this repository if OpenEvolve helps you discover breakthrough algorithms!*
+*If OpenEvolve helps you discover breakthrough algorithms, please consider starring this repository.*
From 5fc9fd41818ec6410d7165a9347ec8b6fb9378b6 Mon Sep 17 00:00:00 2001
From: Asankhaya Sharma