# Go Jdenticon Examples
This directory contains practical examples demonstrating various usage patterns for the go-jdenticon library.
## Examples
### `concurrent-usage.go`
Demonstrates safe and efficient concurrent usage patterns:
- Package-level functions with singleton generator
- Shared generator instances for optimal performance
- Cache performance monitoring
- High-throughput concurrent generation
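The shared-generator pattern from the list above can be sketched with standard-library primitives alone. The `generator` type below is a stand-in for go-jdenticon's real generator (its actual constructor and method names may differ); what the sketch shows is the pattern itself: initialize once with `sync.Once`, then share the instance read-only across goroutines.

```go
package main

import (
	"crypto/sha1"
	"encoding/hex"
	"fmt"
	"sync"
)

// generator is a stand-in for the library's icon generator.
// Generate only reads its input and touches no shared mutable
// state, which is what makes sharing one instance safe.
type generator struct{}

// Generate derives a deterministic value from the input, the way
// identicon libraries hash an identity before drawing it.
func (g *generator) Generate(value string) string {
	sum := sha1.Sum([]byte(value))
	return hex.EncodeToString(sum[:])
}

var (
	once   sync.Once
	shared *generator
)

// sharedGenerator initializes the singleton exactly once, even when
// called from many goroutines at the same time.
func sharedGenerator() *generator {
	once.Do(func() { shared = &generator{} })
	return shared
}

func main() {
	users := []string{"alice@example.com", "bob@example.com", "charlie@example.com"}

	var wg sync.WaitGroup
	results := make([]string, len(users))
	for i, u := range users {
		wg.Add(1)
		go func(i int, u string) {
			defer wg.Done()
			// Each goroutine writes only its own slot, so no lock is needed.
			results[i] = sharedGenerator().Generate(u)
		}(i, u)
	}
	wg.Wait()

	for i, u := range users {
		fmt.Printf("%s -> %s\n", u, results[i][:8])
	}
}
```

Creating the generator once and sharing it is what the library's documentation above calls optimal: construction cost is paid once, and all subsequent calls are read-only.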
Run the example:

```bash
go run examples/concurrent-usage.go
```

Run with race detection:

```bash
go run -race examples/concurrent-usage.go
```
The race detector confirms that all concurrent patterns are thread-safe.
## CLI Batch Processing
The CLI tool includes high-performance batch processing capabilities:
Create a test input file:

```bash
printf "alice@example.com\nbob@example.com\ncharlie@example.com\n" > users.txt
```

Generate icons concurrently:

```bash
go run ./cmd/jdenticon batch users.txt --output-dir ./avatars --concurrency 4
```

Performance comparison:

```bash
# Sequential processing
time go run ./cmd/jdenticon batch users.txt --output-dir ./avatars --concurrency 1

# Concurrent processing (default: CPU count)
time go run ./cmd/jdenticon batch users.txt --output-dir ./avatars
```
Comparing the two timings shows the speedup concurrent batch processing provides over sequential mode.
## Key Takeaways
- **All public functions are goroutine-safe** - call any function from multiple goroutines
- **Generator reuse is optimal** - create one generator and share it across goroutines
- **Icons are immutable** - generated icons are safe to share between goroutines
- **Caching improves performance** - larger cache sizes benefit concurrent workloads
- **Monitor with metrics** - use `GetCacheMetrics()` to track performance
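The kind of hit/miss counters a call like `GetCacheMetrics()` reports are typically maintained as below. This is an illustrative sketch only: the `CacheMetrics` fields, the `iconCache` type, and its `Metrics` method are assumptions for this example, not go-jdenticon's actual API.

```go
package main

import (
	"fmt"
	"sync"
	"sync/atomic"
)

// CacheMetrics holds the kind of numbers a metrics call exposes.
// Field names here are assumed, not the library's.
type CacheMetrics struct {
	Hits, Misses uint64
}

// iconCache is a minimal goroutine-safe cache with hit/miss counters.
type iconCache struct {
	mu     sync.RWMutex
	data   map[string]string
	hits   atomic.Uint64
	misses atomic.Uint64
}

func newIconCache() *iconCache {
	return &iconCache{data: make(map[string]string)}
}

// Get returns the cached value for key, generating and storing it on a miss.
func (c *iconCache) Get(key string, generate func(string) string) string {
	c.mu.RLock()
	v, ok := c.data[key]
	c.mu.RUnlock()
	if ok {
		c.hits.Add(1)
		return v
	}
	c.misses.Add(1)
	v = generate(key)
	c.mu.Lock()
	c.data[key] = v
	c.mu.Unlock()
	return v
}

// Metrics plays the role of GetCacheMetrics() in this sketch.
func (c *iconCache) Metrics() CacheMetrics {
	return CacheMetrics{Hits: c.hits.Load(), Misses: c.misses.Load()}
}

func main() {
	cache := newIconCache()
	gen := func(s string) string { return "icon:" + s }

	cache.Get("alice", gen) // miss
	cache.Get("alice", gen) // hit
	cache.Get("bob", gen)   // miss

	m := cache.Metrics()
	fmt.Printf("hits=%d misses=%d\n", m.Hits, m.Misses) // hits=1 misses=2
}
```

Sampling these counters periodically under load is how to decide whether a larger cache would actually help: a low hit rate on a steady workload suggests the cache is undersized for the number of distinct identities.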
## Performance Notes
From the concurrent usage example:
- Single-threaded equivalent: ~4-15 icons/sec when run under the race detector (which adds substantial overhead)
- Concurrent (20 workers): ~333,000 icons/sec without cache hits
- Memory efficient: ~2-6 KB per generated icon
- Thread-safe: No race conditions detected
The library is highly optimized for concurrent workloads and scales well with the number of CPU cores.