🤖 AI Robot Behavior Simulator

• Task-to-decision flow visualization
• Step-by-step explanation of robot logic
• Condition-based branching (if/else decisions)
• Human-readable AI reasoning representation
• Scenario simulation without live robot deployment

Robot Configuration

Simulation Results

Status
Ready to simulate
Task-to-Decision Flow Visualization
▶ Start: Initialize sensors
🔍 Decision: Check distance sensor
Condition: distance < 0.5
✅ True → Execute: stop
❌ False → Check next condition
🔍 Decision: Check person detection
Condition: object_detected == "person" && confidence > 0.8
✅ True → Execute: greet
⏹ End: Cycle complete
Step-by-Step Explanation
1 Sensor Reading: Distance = 0.8m, Object detected = "person" (92% confidence), Battery = 85%
2 Evaluating Rule 1: distance < 0.5?
Condition: 0.8 < 0.5 = FALSE → Skip 'stop' action
3 Evaluating Rule 2: distance >= 0.5 && distance < 1.0?
Condition: 0.8 >= 0.5 && 0.8 < 1.0 = TRUE → Execute 'slow_down'
4 Executing Action: slow_down - Reduce speed to 0.3 m/s
Human-Readable AI Reasoning
🤔 Reasoning: Robot detected a person at 0.8m with 92% confidence. Since the distance is within 1m but more than 0.5m, the robot slows down as a safety precaution. The person is recognized, but no greeting is triggered this cycle because rules are evaluated in priority order and the slow-down rule matched first. Battery level is sufficient (85%), so there is no need to return to the charger.
Condition-Based Branching
IF distance < 0.5 → STOP
ELSE IF distance < 1.0 → SLOW DOWN
IF person detected → GREET
ELSE → CONTINUE
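The branching above can be written as a plain JavaScript decision function. This is a minimal sketch, assuming first-match semantics (the distance checks take priority over the person check); the function name `decide` is illustrative, not part of the simulator:

```javascript
// Sketch of the condition-based branching as an if/else chain.
// Earlier (safety) checks win over later ones, matching the
// simulator's first-match rule ordering.
function decide(distance, objectDetected) {
  if (distance < 0.5) return "stop";          // too close: emergency stop
  if (distance < 1.0) return "slow_down";     // near: reduce speed
  if (objectDetected === "person") return "greet";
  return "continue";                          // no rule matched
}

console.log(decide(0.8, "person")); // "slow_down"
```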

📖 How To Use AI Robot Behavior Simulator

📝 Step 1: Configure Sensor Data

Enter sensor readings in JSON format. Include distance, light, sound, temperature, battery level, object detection, and confidence values.
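A minimal sensor payload might look like the following. The keys shown are the common ones listed above; any custom key your rules reference can be added alongside them:

```javascript
// Illustrative sensor readings; units follow the conventions used
// by the simulator (meters, lux, dB, °C, percent, 0-1 confidence).
const sensors = {
  distance: 0.8,             // meters to nearest obstacle
  light: 200,                // ambient light in lux
  sound: 45,                 // sound level in dB
  temperature: 22,           // degrees Celsius
  battery: 85,               // charge percentage
  object_detected: "person", // label from object detection
  confidence: 0.92           // detection confidence, 0-1
};

console.log(JSON.stringify(sensors));
```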

⚙️ Step 2: Define Behavior Rules

Create rules with conditions and corresponding actions. Each rule should have a condition (JavaScript expression) and an action (robot command).
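For example, a rule list could be defined like this (the rule set itself is illustrative; the `condition`/`action`/`description` fields follow the format shown in the Example section below):

```javascript
// Illustrative behavior rules. Each condition is a JavaScript
// expression over sensor names; rules are checked top to bottom,
// so safety rules should come first.
const rules = [
  { condition: "distance < 0.5", action: "stop", description: "Emergency stop" },
  { condition: "battery < 20", action: "return_to_charger", description: "Low battery" },
  {
    condition: "object_detected == 'person' && confidence > 0.8",
    action: "greet",
    description: "Greet a recognized person"
  }
];
```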

🎮 Step 3: Select Robot Type

Choose your robot platform: Wheeled, Quadruped, Humanoid, or Drone. This affects available actions and movement patterns.

🔄 Step 4: Choose Scenario

Quick-load predefined scenarios: Obstacle Avoidance, Line Following, Pick & Place, or Search & Rescue.

🚀 Step 5: Simulate

Click "Simulate Robot Behavior" to see the decision flow, step-by-step logic, and human-readable reasoning.

💡 Pro Tips

• Use the Sample button to load an example configuration

• Conditions support: <, >, <=, >=, ==, &&, ||, and sensor values

• Actions can be: stop, move_forward, turn_left, turn_right, slow_down, speed_up, greet, pick_up, place, return_to_charger, etc.

• The simulator evaluates rules in order, executing the first matching condition
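The first-match evaluation described in the tips above can be sketched in a few lines. This is a minimal illustration, not the simulator's actual implementation, and evaluating condition strings with `new Function` is shown only because conditions are plain JavaScript expressions (never do this with untrusted input):

```javascript
// Minimal sketch of first-match rule evaluation: the first rule
// whose condition is true determines the action, and later rules
// are skipped for that cycle.
function evaluateRules(sensors, rules) {
  for (const rule of rules) {
    // Expose sensor names as parameters so the condition string can
    // reference them directly, e.g. "distance < 0.5".
    const keys = Object.keys(sensors);
    const test = new Function(...keys, `return (${rule.condition});`);
    if (test(...Object.values(sensors))) {
      return rule.action;   // first matching rule wins
    }
  }
  return "continue";        // no rule matched; keep current behavior
}

const rules = [
  { condition: "distance < 0.5", action: "stop" },
  { condition: "battery < 20", action: "return_to_charger" }
];
console.log(evaluateRules({ distance: 0.3, battery: 15 }, rules)); // "stop" - safety rule fires first
```

Note how the order encodes priority: even with a low battery, the obstacle rule wins because it appears first.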

🔍 Example

# Sensor Input:
{ "distance": 0.3, "light": 200, "battery": 15, "object_detected": "wall" }

# Rules:
[
  { "condition": "distance < 0.5", "action": "stop", "description": "Emergency stop" },
  { "condition": "battery < 20", "action": "return_to_charger", "description": "Low battery" }
]

# Result:
✅ Condition 1: distance < 0.5 = TRUE → Execute 'stop'
⏹ Robot stops due to obstacle

❓ Frequently Asked Questions

What sensor data formats are supported?
The simulator accepts JSON objects with key-value pairs. Common keys include: distance (meters), light (lux), sound (dB), temperature (°C), battery (%), object_detected (string), confidence (0-1), position (x,y), orientation (degrees), and custom sensor values.
How are behavior rules evaluated?
Rules are evaluated in order from top to bottom. The first rule whose condition evaluates to true triggers its action, and subsequent rules are ignored for that cycle. This allows for priority-based decision making (e.g., safety rules first).
Can I simulate complex decision trees?
Yes! You can create nested conditions using && (AND) and || (OR) operators. For example: "distance < 1.0 && object_detected == 'person' && confidence > 0.9" triggers only when all conditions are met. You can also chain multiple rules for complex behaviors.
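The compound condition from that answer can be checked directly in JavaScript. A small sketch, using the sensor names from the examples above:

```javascript
// All three sub-conditions must hold for the compound condition to pass.
const sensors = { distance: 0.8, object_detected: "person", confidence: 0.92 };
const { distance, object_detected, confidence } = sensors;

const shouldGreet =
  distance < 1.0 &&               // 0.8 < 1.0  → true
  object_detected === "person" && // matches    → true
  confidence > 0.9;               // 0.92 > 0.9 → true

console.log(shouldGreet); // true
```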
Is this simulator accurate for real robots?
This simulator is designed for behavior-logic validation and educational purposes. It accurately models decision flows, condition checking, and action selection based on sensor inputs. For physical robot deployment, additional factors such as motor control, physics, and real-time constraints must also be taken into account.
Can I save my robot configurations?
Yes! Use the Export button to download your current sensor data and rules as a JSON file. You can later load these configurations or modify them in any text editor. The Copy button copies the simulation results to clipboard for documentation.
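An exported configuration is an ordinary JSON file, so it round-trips through any text editor. The exact schema the Export button uses is an assumption here; this sketch just shows the serialize/restore cycle:

```javascript
// Hypothetical exported configuration (field names are illustrative).
const config = {
  robotType: "wheeled",
  sensors: { distance: 0.8, battery: 85 },
  rules: [{ condition: "distance < 0.5", action: "stop" }]
};

const json = JSON.stringify(config, null, 2); // pretty-printed for hand editing
const restored = JSON.parse(json);            // loading it back later
console.log(restored.robotType);              // "wheeled"
```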
What robot actions are available?
Common actions: stop, move_forward, move_backward, turn_left, turn_right, slow_down, speed_up, greet, pick_up, place, return_to_charger, rotate, beep, flash_light, wait, and custom actions. Available actions depend on selected robot type.
How does the AI reasoning work?
The AI reasoning engine analyzes each rule condition against current sensor data, explains why conditions pass or fail, and provides human-readable explanations of the decision-making process. This helps understand robot behavior without needing to interpret complex code.
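Conceptually, each rule check produces one sentence of explanation. A sketch of that idea, where the function name and wording are illustrative rather than the simulator's exact output:

```javascript
// Turn one rule check into a human-readable sentence, stating the
// condition, the sensor values it was tested against, and the outcome.
function explainRule(rule, sensors, passed) {
  const verdict = passed ? "PASSED" : "FAILED";
  const outcome = passed ? `execute '${rule.action}'` : "skip this rule";
  return `Condition "${rule.condition}" ${verdict} for sensors ` +
         `${JSON.stringify(sensors)} → ${outcome}`;
}

console.log(explainRule(
  { condition: "distance < 0.5", action: "stop" },
  { distance: 0.3 },
  true
));
```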
Is my data secure?
Absolutely! All simulation processing happens entirely in your browser. Your sensor data and robot rules are never sent to any server. You can disconnect from the internet and the simulator will still work perfectly.