Galaxy Editor Final Deliverable
Work by: Sultan Jamalbekov, Ryan Trac, Ruben Gonzalez, Alex Benny
Link to project deployment: Project Deployment
Link to deliverable website: Deliverable Website
Link to GitHub repository: Repository
Link to Google Drive with project results: Google Drive
Technical Approach
Starter Code Functionality
- Interactive real-time render of a galaxy: move, pan, zoom.
- Two types of objects: star and haze.
- The galaxy is generated statically.
Our Implementation
- Interactive real-time render of a galaxy: move, pan, zoom.
- Three types of objects: star, haze, nebula.
- The galaxy is generated from parameters that can be adjusted by the user in real time.
- The galaxy can be generated from a 2D input drawn by the user on a canvas.
- Various artistic post-processing effects.
- Ability to produce a render of the scene.
- Realistic galaxy rotation.
Ray Tracing Render
- Techniques Used: Ray tracing was implemented to generate PNG renders of the galaxy scene. Rays were cast per pixel from the camera through a perspective projection, intersecting spheres representing stars, nebulae, and haze. Diffuse shading with ambient lighting was applied using the dot product of surface normals and light directions, as described in Fundamentals of Computer Graphics (Shirley et al., 2009).
- Algorithms Implemented:
- Vector operations (addition, subtraction, scaling, normalization, cross product) were defined for ray calculations.
- Sphere intersection was computed using the quadratic formula to solve for ray-sphere hit distances (see the sketch at the end of this section).
- The trace function identified the closest intersection and computed colors via computeLighting, incorporating point and hemisphere lights.
- A BVH (Bounding Volume Hierarchy) was used to optimize intersection tests, as per Physically Based Rendering (Pharr et al., 2016).
- Deviations from References: Unlike full path tracing in Pharr et al., a simplified ray tracer was implemented, focusing on primary rays and diffuse shading without reflections or refractions, prioritizing performance for large scenes.
- Unique Decisions:
- Stars, nebulae, and haze were modeled as spheres with varying radii, balancing visual fidelity and computational efficiency.
- The renderRaytracePNG function in main.js was designed to output a PNG, ensuring manageable render times while maintaining quality.
- Rationale: Simplified ray tracing was chosen to achieve high-quality stills without requiring GPU-accelerated path tracing, making the feature accessible in a browser-based environment.
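To make the pipeline concrete, here is a minimal sketch of the per-pixel sphere test and closest-hit search described above. It assumes plain `{x, y, z}` vectors; the function names (`intersectSphere`, `closestHit`) are illustrative rather than the exact identifiers in main.js, and the linear scan shown is what the BVH replaces in the real renderer.

```js
// Minimal ray-sphere intersection: solve |o + t*d - c|^2 = r^2 for t.
// Returns the nearest positive hit distance, or Infinity on a miss.
function intersectSphere(origin, dir, center, radius) {
  const oc = {
    x: origin.x - center.x,
    y: origin.y - center.y,
    z: origin.z - center.z,
  };
  const a = dir.x * dir.x + dir.y * dir.y + dir.z * dir.z;
  const b = 2 * (oc.x * dir.x + oc.y * dir.y + oc.z * dir.z);
  const c = oc.x * oc.x + oc.y * oc.y + oc.z * oc.z - radius * radius;
  const disc = b * b - 4 * a * c;
  if (disc < 0) return Infinity;        // ray misses the sphere
  const sqrtDisc = Math.sqrt(disc);
  const t0 = (-b - sqrtDisc) / (2 * a); // nearer root
  const t1 = (-b + sqrtDisc) / (2 * a);
  if (t0 > 1e-4) return t0;
  if (t1 > 1e-4) return t1;             // camera is inside the sphere
  return Infinity;
}

// Closest-hit search; in the real renderer the linear scan is replaced
// by a BVH traversal that culls whole subtrees of non-intersecting spheres.
function closestHit(origin, dir, spheres) {
  let best = { t: Infinity, sphere: null };
  for (const s of spheres) {
    const t = intersectSphere(origin, dir, s.center, s.radius);
    if (t < best.t) best = { t, sphere: s };
  }
  return best.sphere ? best : null;
}
```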
Procedural Generation
- Techniques Used: Procedural generation was enhanced to create galaxies with stars, haze, and nebulae, using Gaussian distributions for core/bar and logarithmic spirals for arms, as inspired by Procedural Content Generation in Games (Shaker et al., 2016). A 2D canvas input can be processed into a 200x200 array to guide object placement, with Perlin noise adding organic variation.
- Algorithms Implemented:
- generateStarsFromArray, generateHazeFromArray, and generateNebulaeFromArray in galaxy.js mapped 2D array intensities to 3D positions, scaling coordinates to match galaxy dimensions.
- Logarithmic spiral equations in utils.js used ARM_PITCH and ARMS for arm placement, modulated by Perlin noise in generateArray.js.
- Gaussian randomness (gaussianRandom) was applied for core, bar, and halo distributions, with parameters like CORE_X_DIST and HALO_RADIUS controlling spread (see the sketch at the end of this section).
- Deviations from References: Unlike Shaker et al.’s focus on game-level generation, the approach prioritized astrophysical realism (e.g., bars, halos) and user-driven input via canvas, integrating real-time parameter tweaks.
- Unique Decisions:
- Canvas input was inverted (black=1, white=0) and downsampled to a 200x200 array to capture fine details, addressing initial mirroring issues by reversing x-coordinates.
- Perlin noise was added to generateArray.js to enhance spiral arm variation, avoiding uniform patterns.
- Rationale: Canvas-based generation allowed user creativity, while Perlin noise and expanded parameters (e.g., ARM_PITCH, BAR_LENGTH) ensured realistic, tunable galaxy structures, aligning with astrophysical models.
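As a concrete illustration of the two sampling primitives above, the sketch below assumes a Box-Muller implementation of gaussianRandom and the standard logarithmic spiral r = a * e^(pitch * theta). ARM_PITCH and ARMS come from the project config described above; ARM_A, ARM_SPREAD, and GALAXY_THICKNESS are illustrative stand-ins for the remaining parameters.

```js
// Gaussian sample via the Box-Muller transform; a plausible shape for
// gaussianRandom, used for the core, bar, and halo distributions.
function gaussianRandom(mean = 0, stdev = 1) {
  const u = 1 - Math.random(); // in (0, 1], avoids log(0)
  const v = Math.random();
  const z = Math.sqrt(-2 * Math.log(u)) * Math.cos(2 * Math.PI * v);
  return z * stdev + mean;
}

// Place a star on a logarithmic spiral arm: r = ARM_A * e^(ARM_PITCH * theta).
// Each of the ARMS arms is offset by 2*PI/ARMS; Gaussian jitter pulls the
// point slightly off the ideal curve so the arm does not look drawn-on.
function spiralPosition(armIndex, theta, config) {
  const armOffset = (2 * Math.PI * armIndex) / config.ARMS;
  const r = config.ARM_A * Math.exp(config.ARM_PITCH * theta);
  return {
    x: r * Math.cos(theta + armOffset) + gaussianRandom(0, config.ARM_SPREAD),
    y: r * Math.sin(theta + armOffset) + gaussianRandom(0, config.ARM_SPREAD),
    z: gaussianRandom(0, config.GALAXY_THICKNESS),
  };
}
```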
Nebulae Object
- Techniques Used: Nebulae were introduced as layered sprites with additive blending, using a cloudy texture (nebula.png) to mimic star-forming regions, as inspired by Real-Time Rendering (Akenine-Möller et al., 2018). Region-specific colors and scales were applied for visual distinction.
- Algorithms Implemented:
- The Nebula class in nebula.js created multiple sprite layers with varying opacity and scale, using NEBULA_SCALE_MIN/MAX for size randomization (see the sketch at the end of this section).
- generateNebulaeFromArray in galaxy.js placed nebulae based on 2D array intensities, with positions scaled to galaxy dimensions.
- Color variation was added per layer using random RGB adjustments, enhancing cloud-like depth.
- Deviations from References: Unlike Akenine-Möller et al.’s focus on complex particle systems, simpler sprite-based nebulae were used for performance, with layered sprites simulating depth instead of volumetric rendering.
- Unique Decisions:
- Three sprite layers were used per nebula, with outer layers having larger scales and lower opacity, creating a glowing, diffuse effect.
- Region-specific colors (blue for arms, magenta for core) were applied to align with astrophysical expectations.
- Rationale: Layered sprites balanced visual richness with browser-based rendering constraints, while region-based coloring enhanced realism by reflecting star-forming environments.
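The following is a minimal sketch of the layered-sprite construction described above, using Three.js. It builds a nebula as a standalone function for brevity, whereas the project wraps this logic in the Nebula class; the exact opacities, scale factors, and color jitter here are illustrative.

```js
import * as THREE from 'three';

// Build one nebula as three stacked sprites sharing a cloudy texture:
// outer layers are larger and fainter, giving a soft, glowing look.
function createNebula(texture, position, baseColor, baseScale) {
  const group = new THREE.Group();
  for (let layer = 0; layer < 3; layer++) {
    const material = new THREE.SpriteMaterial({
      map: texture, // e.g. nebula.png loaded via THREE.TextureLoader
      // Small per-layer lightness jitter for cloud-like depth.
      color: baseColor.clone().offsetHSL(0, 0, (Math.random() - 0.5) * 0.1),
      blending: THREE.AdditiveBlending, // layers brighten where they overlap
      transparent: true,
      opacity: 0.6 / (layer + 1),       // outer layers are more diffuse
      depthWrite: false,                // avoid sprite sorting artifacts
    });
    const sprite = new THREE.Sprite(material);
    sprite.scale.setScalar(baseScale * (1 + 0.5 * layer));
    group.add(sprite);
  }
  group.position.copy(position);
  return group;
}
```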
Custom Effects
- Techniques Used: Post-processing shaders were implemented via the Three.js EffectComposer, blending base, bloom, and overlay textures with artistic effects, as outlined in OpenGL Shading Language (Rost et al., 2009). Shaders included BlackWhite, Invert, RainbowCycle, and Heatmap, applied in the fragment stage.
- Algorithms Implemented:
- BlackWhiteShader used a luminance dot product for grayscale conversion (see the sketch at the end of this section).
- InvertShader flipped RGB values (1.0 - color).
- RainbowCycleShader applied time-based hue cycling using sin and mod functions.
- HeatmapShader mapped luminance to a blue-to-red thermal gradient.
- Shader switching was handled in main.js by updating baseComposer.passes[1] based on SHADER_TYPE.
- Deviations from References: Unlike Rost et al.’s complex shader pipelines, simpler post-processing effects were implemented, focusing on 2D texture manipulation rather than 3D geometry alterations due to browser limitations.
- Unique Decisions:
- A dropdown in index.html allowed real-time shader switching, integrated via index.js listeners updating config.SHADER_TYPE.
- RainbowCycleShader used a time uniform for animated hue shifts, enhancing visual dynamism.
- Rationale: Simple fragment shaders ensured performance, while the dropdown provided an intuitive interface for artistic exploration, making the galaxy visually engaging without requiring scene regeneration.
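For reference, a grayscale shader of the kind described above can be written in the object format that THREE.ShaderPass expects. The sketch below is a plausible shape for BlackWhiteShader, not necessarily the exact project code; tDiffuse is the composed scene texture that EffectComposer fills in automatically.

```js
// Grayscale post-processing shader: luminance is the dot product of the
// RGB color with the Rec. 709 weights, applied per fragment.
const BlackWhiteShader = {
  uniforms: {
    tDiffuse: { value: null }, // set by EffectComposer each frame
  },
  vertexShader: /* glsl */ `
    varying vec2 vUv;
    void main() {
      vUv = uv;
      gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
    }
  `,
  fragmentShader: /* glsl */ `
    uniform sampler2D tDiffuse;
    varying vec2 vUv;
    void main() {
      vec4 color = texture2D(tDiffuse, vUv);
      float luma = dot(color.rgb, vec3(0.2126, 0.7152, 0.0722));
      gl_FragColor = vec4(vec3(luma), color.a);
    }
  `,
};
```

A pass built from such an object (new ShaderPass(BlackWhiteShader)) can then be swapped into baseComposer.passes[1] as described above; animated shaders like RainbowCycleShader additionally need their time uniform advanced each frame (e.g., pass.uniforms.time.value += deltaTime) in the render loop.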
Lighting Mode
- Techniques Used: The original galaxy generator uses PNGs as texture maps for sprite materials, which are then passed through an Unreal Bloom post-processor that gives them a glow. To remove this built-in lighting, we made the LightingStar object a SphereGeometry with radius 0.5 to simulate a small dot star. A similar process is used for the LightingNebula class, which is also a small SphereGeometry with radius 0.5 but made transparent to mirror the nebulae in normal galaxy generation. The LightingGalaxy then creates instances of LightingStar and LightingNebula instead of star and nebula in the generation functions.
- Algorithms Implemented:
- To support switching between modes, we implemented clear_scene, which iterates through all scene objects, materials, and geometries and removes them.
- The main lighting algorithm, implemented in index.html, is generate_light. It first calls clear_scene to remove all the self-illuminated objects. We then create a new instance of LightingGalaxy, which generates all the unlit stars and nebulae. Then, from user input (or the default parameters), we create config.NUM_POINT_LIGHTS point lights whose intensity can also be modified by the user.
- Then, for each point light, we perform these calculations:

```js
// Randomly flip the sign of each axis so lights land in all four quadrants.
const sign_x = Math.random() < 0.5 ? 1 : -1;
const sign_y = Math.random() < 0.5 ? 1 : -1;
const pointLight = new THREE.PointLight(0xffffff, config.POINT_LIGHT_INTESITY, 0);
// Place the light at a random position within the outer-core bounds,
// in the disk plane (z = 0).
pointLight.position.set(
  (Math.random() * config.OUTER_CORE_X_DIST) * sign_x,
  (Math.random() * config.OUTER_CORE_Y_DIST) * sign_y,
  0
);
scene.add(pointLight);
```
- This places each point light at a random position based on the grid parameters; sign_x and sign_y determine whether the x and y values are positive or negative, respectively.
- Point light visualizers are also added to the scene to show exactly where the light is coming from.
- Finally, we add a hemisphere light to provide ambient illumination that interpolates between two colors (see the sketch at the end of this section).
- References: I did not have a specific reference for the lighting, but here are some of the resources I used to build my understanding of Three.js lighting and geometries.
- Unique Decisions:
- Adding controllable parameters such as number of point lights, point light intensity, and hemisphere light intensity
- Opting for Point lights instead of Directional lights.
- Rationale:
- Adding controllable parameters highlights the importance and power of lighting within a scene when objects aren't automatically illuminated. It also helps the user understand how ambient lighting interacts with point lighting, and the trade-off between how illuminated a scene is and performance.
- When experimenting with which light to use, we found directional lights unfit to illuminate our galaxy due to the sheer scale of the scene and how many lights we would have to add. The illumination the directional lights provided was also quite weak when zoomed out. Point lights better illuminated an area and, as a result, fewer were needed to add slight illumination to the scene.
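For completeness, here is a minimal sketch of the ambient hemisphere light and a lighting-mode star along the lines described above. The two colors and the config.HEMISPHERE_LIGHT_INTENSITY name are illustrative; the SphereGeometry radius of 0.5 comes from the description above.

```js
// Hemisphere light: ambient fill that interpolates between a sky color
// (from above) and a ground color (from below), complementing the
// randomly placed point lights shown earlier.
const hemiLight = new THREE.HemisphereLight(
  0x8888ff,                          // sky color (illustrative)
  0xff8866,                          // ground color (illustrative)
  config.HEMISPHERE_LIGHT_INTENSITY  // user-adjustable, per the GUI
);
scene.add(hemiLight);

// A lighting-mode star: a small sphere whose material actually responds
// to scene lights, unlike the bloom-lit sprites of the normal mode.
const star = new THREE.Mesh(
  new THREE.SphereGeometry(0.5),
  new THREE.MeshStandardMaterial({ color: 0xffffff })
);
scene.add(star);
```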
GUI
- Techniques Used: To maintain a clean and organized look, extensive planning and deployment of JavaScript and CSS were required. Additionally, browser storage was utilized to persist the user’s drawing across different pages (see the sketch at the end of this section).
- Algorithms Implemented:
- To preserve the live nature of the parameters, numerous event listeners were implemented to track any changes made to the settings.
- Both JavaScript and CSS were employed in varying amounts to add animations and enhance the page’s professional feel.
- Deviations from References: The original version lacked a GUI beyond basic controls, so everything else had to be created from scratch. While the idea for the drawing feature was inspired by one of our references, no code was copied or directly referenced when developing it.
- Unique Decisions:
- Instead of a long, scrollable menu taking up significant space over the rendered image, the menu was implemented as a dropdown.
- On the drawing page, the background was originally a breathing gradient, but it was later replaced with a GIF that better fit the overall theme.
- Rationale: With a large number of parameters, it’s easy for the screen to become cluttered and distract from the main purpose of the page. To address this, I chose a cleaner design where only essential elements remain visible at all times, with the rest hidden away. While prioritizing cleanliness can sometimes reduce intuitiveness, the layout was carefully structured so that users can easily find what they need without confusion.
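As a sketch of the browser-storage approach mentioned under Techniques Used, a drawing can be persisted across pages by serializing the canvas to a data URL. The element id 'drawCanvas' and the storage key 'galaxyDrawing' are illustrative, not the project's actual names.

```js
// On the drawing page: save the canvas contents before navigating away.
const canvas = document.getElementById('drawCanvas'); // illustrative id
window.addEventListener('beforeunload', () => {
  localStorage.setItem('galaxyDrawing', canvas.toDataURL('image/png'));
});

// On the editor page: restore the drawing if one was saved.
const saved = localStorage.getItem('galaxyDrawing');
if (saved) {
  const img = new Image();
  img.onload = () => canvas.getContext('2d').drawImage(img, 0, 0);
  img.src = saved;
}
```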
Physically Realistic Rotation
- Techniques Used:
- Differential rotation curve implemented in `galaxy.js` to mimic observed flat rotation curves, as described in Mistele et al. (2024).
- Rotation applied to stars, haze, and nebulae using `deltaTime` for frame-rate independence.
- Algorithms Implemented:
- Core constants: define rotation parameters like velocity (v0 ~200 km/s, scaled), core radius (r0 ~3 kpc), and max radius (rMax for the halo).
- Angular velocity calculation: a helper function getAngularVelocity(r) models a realistic galaxy rotation curve based on radius r:
- For r < r0 (core): linear rise like solid-body rotation, omega = (v0 / r0) * (r / r0); velocity increases with r for rigid spin.
- For r0 ≤ r < rMax (disk): flat curve, omega = v0 / r, giving constant tangential speed, common in spiral galaxies due to dark matter.
- For r ≥ rMax (halo): Keplerian decline, omega = (v0 / rMax) * sqrt(rMax / r); velocity drops off like orbiting planets (v ~ 1/sqrt(r)).
- Rotation application: for each element (star, haze, nebula):
- Compute r from the x, y position (assuming z = 0, the disk plane).
- Get omega scaled by deltaTime for smooth, time-based rotation.
- Use a rotation matrix (cos/sin) to update x, y: new_x = x*cos - y*sin; new_y = x*sin + y*cos, rotating the point around the origin by angle omega.
- Copy the updated position to the object's visual representation (e.g., the Three.js sprite).
- Element-specific handling:
- Stars and haze: simple position update per particle.
- Nebulae: update the main position, then rotate each sub-sprite individually to preserve detailed structure.
- Overall: this mimics how galaxies spin differentially (not rigidly), avoiding "winding" issues; inner parts rotate faster angularly, outer parts slower, creating realistic arms over time (see the sketch below).
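Putting the pieces above together, a minimal sketch of the rotation curve and the per-frame update might look like the following. The constants mirror the values quoted above (the rMax value is illustrative), and the element layout ({x, y} plus a visual object) is simplified.

```js
// Rotation-curve constants (scaled scene units).
const v0 = 200;  // asymptotic tangential velocity (~200 km/s, scaled)
const r0 = 3;    // core radius (~3 kpc, scaled)
const rMax = 50; // radius where the flat disk curve gives way to the halo

// Piecewise angular velocity: solid-body core, flat disk, halo falloff.
function getAngularVelocity(r) {
  if (r < r0) return (v0 / r0) * (r / r0);  // core: rigid rotation
  if (r < rMax) return v0 / r;              // disk: flat curve
  return (v0 / rMax) * Math.sqrt(rMax / r); // halo: declining omega
}

// Rotate one element around the origin in the disk plane (z ~ 0),
// scaled by deltaTime so the motion is frame-rate independent.
function rotateElement(el, deltaTime) {
  const r = Math.hypot(el.x, el.y);
  if (r === 0) return;
  const angle = getAngularVelocity(r) * deltaTime;
  const cos = Math.cos(angle);
  const sin = Math.sin(angle);
  const newX = el.x * cos - el.y * sin;
  const newY = el.x * sin + el.y * cos;
  el.x = newX;
  el.y = newY;
  // Copy to the visual representation, e.g. el.sprite.position.set(el.x, el.y, 0);
}
```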
Problems Encountered and How They Were Tackled
- Real-Time Ray Tracing Performance
- Problem: Attempting real-time ray tracing in the browser for rendering the galaxy scene, which included stars, haze, and nebulae as spheres, was prohibitively slow due to the high computational cost of per-pixel ray-sphere intersections across thousands of objects.
- Solution: Shifted to generating a static PNG render via the renderRaytracePNG function in main.js, producing a 400x400 image by tracing rays once per pixel and downloading the result. This approach avoided continuous rendering, significantly reducing performance demands while maintaining high-quality output.
- Rationale: A static PNG render balanced visual fidelity with browser constraints, allowing users to obtain a high-quality image without requiring real-time interaction, aligning with the project's goal of accessible, browser-based visualization.
- Slow PNG Render
- Problem: The initial PNG render in renderRaytracePNG was still slow, taking significant time to process due to the need to check ray intersections against all scene objects (stars, haze, nebulae) for each pixel, resulting in O(n) complexity per ray, where n is the number of objects.
- Solution: Implemented a Bounding Volume Hierarchy (BVH) data structure in main.js to accelerate ray tracing. The buildBVH function organized spheres into a hierarchical tree, reducing intersection tests to O(log n) by culling non-intersecting regions early, as inspired by Physically Based Rendering (Pharr et al., 2016).
- Rationale: BVH acceleration was chosen for its efficiency in handling large scenes, enabling faster rendering of the galaxy's complex geometry while maintaining accuracy, making the PNG output practical for user interaction.
- Loss of Fine Details in Canvas Input
- Problem: Fine details in user-drawn canvas inputs (e.g., text like "hello") were partially lost when processed into a 100x100 array, due to coarse resolution reducing pixel fidelity.
- Solution: Increased the array resolution to 200x200 in drawCanvas.js and averaged pixel intensities per grid cell to capture more detail, improving the mapping of canvas drawings to star, haze, and nebula placements in galaxy.js (see the sketch at the end of this list).
- Rationale: Higher resolution and averaging preserved user-drawn patterns, enhancing the creative flexibility of the 2D input feature.
- Canvas Mirroring in 2D Input Generation
- Problem: Galaxy generation from user-drawn 2D canvas input in drawCanvas.js produced mirrored results due to incorrect x-coordinate mapping when converting canvas pixels to 3D positions in galaxy.js.
- Solution: Modified the processCanvas function in drawCanvas.js to reverse x-coordinates (px = Math.floor((width - 1 - x) * scaleX + dx)), ensuring correct orientation when mapping 2D array data to 3D galaxy coordinates.
- Rationale: Correcting the coordinate mapping ensured the galaxy accurately reflected user drawings, preserving intuitive control over the generated structure.
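For the fine-detail fix above, a downsampling routine along these lines averages source pixels per grid cell, inverts intensities so ink maps to 1, and flips the x-axis to correct the mirroring. The function name canvasToArray is illustrative, not the project's actual identifier.

```js
// Downsample the drawing canvas into a SIZE x SIZE intensity array,
// averaging all source pixels in each cell so fine strokes survive.
function canvasToArray(canvas, SIZE = 200) {
  const ctx = canvas.getContext('2d');
  const { width, height } = canvas;
  const data = ctx.getImageData(0, 0, width, height).data; // RGBA bytes
  const cellW = width / SIZE;
  const cellH = height / SIZE;
  const grid = [];
  for (let gy = 0; gy < SIZE; gy++) {
    const row = [];
    for (let gx = 0; gx < SIZE; gx++) {
      let sum = 0, count = 0;
      const x0 = Math.floor(gx * cellW), x1 = Math.floor((gx + 1) * cellW);
      const y0 = Math.floor(gy * cellH), y1 = Math.floor((gy + 1) * cellH);
      for (let y = y0; y < y1; y++) {
        for (let x = x0; x < x1; x++) {
          const px = width - 1 - x;            // flip x to fix mirroring
          const i = (y * width + px) * 4;
          const gray = (data[i] + data[i + 1] + data[i + 2]) / 3;
          sum += 1 - gray / 255;               // invert: black = 1, white = 0
          count++;
        }
      }
      row.push(count ? sum / count : 0);
    }
    grid.push(row);
  }
  return grid;
}
```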
Lessons Learned
- Ray Tracing Feasibility in Browsers
- Attempting real-time ray tracing for a galaxy with thousands of objects (stars, haze, nebulae) revealed its impracticality in a browser due to the computational intensity of per-pixel ray-sphere intersections. This underscored the need to evaluate rendering techniques against hardware constraints, leading to a shift toward a static PNG render in renderRaytracePNG to achieve high-quality output without real-time demands.
- BVH for Ray Tracing Acceleration
- Initial PNG rendering was slow due to linear intersection tests (O(n) per ray). Implementing a Bounding Volume Hierarchy (BVH) in main.js reduced complexity to O(log n) by hierarchically culling non-intersecting objects, as per Physically Based Rendering (Pharr et al., 2016). This taught the critical role of spatial data structures in optimizing ray tracing for large scenes, balancing speed and accuracy.
- Rendering Pipeline Efficiency
- The multi-pass rendering pipeline in main.js using EffectComposer (RenderPass, UnrealBloomPass, ShaderPass) highlighted the importance of layered rendering for visual effects like bloom and custom shaders. Understanding how to separate base, bloom, and overlay layers clarified the trade-offs between rasterization (fast, sprite-based) and ray tracing (slow, high-fidelity), guiding the choice of rasterization for interactive viewing and ray tracing for static output.
- Shader Integration in Post-Processing
- Implementing custom shaders (e.g., BlackWhite, RainbowCycle) in the fragment stage of EffectComposer demonstrated how GPU-based post-processing can efficiently transform rendered textures without altering geometry. The need to manually update uniforms (e.g., time in RainbowCycleShader) emphasized the importance of managing dynamic inputs in shader pipelines for animated effects.
- Scene Representation for Ray Tracing
- Modeling stars, haze, and nebulae as spheres with varying radii in renderRaytracePNG simplified intersection calculations but required careful handling of lighting (diffuse shading via dot product) to achieve realistic visuals. This highlighted the importance of geometric approximations in ray tracing to balance computational cost with visual fidelity, reinforcing the need for efficient data structures like BVH to handle complex scenes.
References
- Fundamentals of Computer Graphics (Shirley, P., et al., 2009). A K Peters/CRC Press.
- Physically Based Rendering (Pharr, M., et al., 2016). Morgan Kaufmann.
- Procedural Content Generation in Games (Shaker, N., et al., 2016). Springer.
- Real-Time Rendering (Akenine-Möller, T., et al., 2018). A K Peters/CRC Press.
- OpenGL Shading Language (Rost, R. J., et al., 2009). Addison-Wesley.
Team Member Contributions
- Sultan
- Found the original open-source project; wrote, submitted, and deployed the proposal; added procedural generation of the galaxy from user-defined parameters; added post-processing effects (black & white, invert, heatmap, etc.); added the nebulae object; wrote, deployed, and submitted the graded milestone; enabled galaxy generation from user input with a basic GUI; merged code from team members; implemented rendering with ray tracing and BVH; wrote and submitted the final deliverable; implemented realistic rotation of the galaxy.
- Ryan
- Implemented the lighting mode and created new galaxy, star, and nebula classes that respond to lighting. Also added randomization to light positions, plus parameters to change the number of lights, point light intensity, and hemisphere light intensity.
- Ruben
- Implemented the fully fleshed-out graphical user interface. Organized the parameters and controls into a clean, easy-to-navigate format. Re-implemented drawing using a more streamlined process. Recorded and edited the videos.