# Creating Custom Skills in Pymirokai

Pymirokai allows you to create **skills** to enhance its capabilities according to your needs.

---

## How to Create a Custom Skill

### 1. Use the `@skill` Decorator

The `@skill` decorator is required to register the function as a valid skill.

### 2. Define Access Level

Specify who can use the skill with the `access_level` parameter.

### 3. Define Whether the LLM Can Trigger the Skill

To allow the LLM to trigger the skill automatically when answering a user request, set `can_be_triggered_by_llm` to `True`.

Example:

```python
can_be_triggered_by_llm=True,
```

### 4. Describe the Skill for Natural Language Interaction

To allow natural triggering of the skill, define `verbal_descriptions` with example phrases in multiple languages.

Example:

```python
verbal_descriptions={
    "en": ["find rhymes", "words that rhyme", "rhyme finder"],
    "fr": ["trouver des rimes", "mots qui riment", "rime"],
}
```

### 5. Define Input Parameters

Each parameter must be described so the LLM understands what values to provide.

Example:

```python
parameters=[
    ParameterDescription(name="word", description="The word to find rhymes for."),
    ParameterDescription(name="max_results", description="Maximum number of rhymes to retrieve."),
]
```

### 6. Use Proper Function Signatures

- Include type annotations for **all** parameters.
- If interacting with Pymirokai's built-in functions, add `robot` as a parameter.

Correct:

```python
async def fetch_rhymes(robot: Robot, word: str, max_results: int = 5, timeout: int = 5) -> dict[str, Any]:
```

Incorrect (missing type annotations):

```python
async def fetch_rhymes(word, max_results=5, timeout=5):  # Incorrect
```

### 7. Interacting with the Robot

If your skill interacts with **Pymirokai functionalities**, include the `robot` parameter:

```python
async def fetch_rhymes(robot: Robot, word: str, max_results: int = 5, timeout: int = 5) -> dict[str, Any]:
```

This allows using **robot actions** such as:

```python
await robot.wave().completed()  # Make the robot wave
await robot.say("Here are some rhymes!")  # Make the robot speak
```

### 8. Returning the Response

A skill can return:

- Nothing
- A **string**
- A **list**
- A **dictionary**
- Or anything else that is JSON-serializable

However, if you want the **LLM to reformulate the response**, **you must return a dictionary with the key `"llm_output"`**.

✅ **Correct Example (the LLM can process and reformulate):**

```python
return {"llm_output": "Words that rhyme with 'fun': sun, run, gun, bun, pun."}
```

❌ **Incorrect Example (the LLM won't reformulate):**

```python
return "Words that rhyme with 'fun': sun, run, gun, bun, pun."  # Just a string
```
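Putting the pieces above together, a bare-bones skill might look like the following sketch. The skill name, example phrases, and greeting behavior are purely illustrative, and the `Robot` import path is an assumption to adjust to your pymirokai installation; the next section shows a complete, working example built the same way.

```python
from typing import Any

from pymirokai.decorators.skill import skill, ParameterDescription
from pymirokai.enums import AccessLevel

# Assumed import path for Robot; adjust to match your pymirokai installation.
from pymirokai.robot import Robot


@skill(
    access_level=AccessLevel.USER,  # who may use the skill
    can_be_triggered_by_llm=True,  # let the LLM call the skill when it fits the request
    verbal_descriptions={
        "en": ["greet someone", "say hello"],
        "fr": ["saluer quelqu'un", "dire bonjour"],
    },
    parameters=[
        ParameterDescription(name="name", description="The name of the person to greet."),
    ],
)
async def greet_person(robot: Robot, name: str) -> dict[str, Any]:
    """Illustrative skill: greet a person by name."""
    await robot.say(f"Hello, {name}!")  # use a built-in robot action
    return {"llm_output": f"I just greeted {name}."}  # lets the LLM reformulate the reply
```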
---

## Example: Fetch Rhymes Skill

Here is a complete example of a skill that finds **rhyming words** for a given input.

```{mermaid}
graph TD;
    A["User request"] --> B["Skill receives input"];
    B --> C["Fetch rhymes from API"];
    C -->|"Success"| D["Extract API response"];
    C -->|"Failure"| E["Handle error and output"];
    D --> F["Format response"];
    F --> G["Robot announces rhymes"];
    G --> H["Return structured response"];
    E --> H;
```

```python
from typing import Any

import requests

from pymirokai.decorators.skill import skill, ParameterDescription
from pymirokai.enums import AccessLevel, BaseEmotion


@skill(
    access_level=AccessLevel.USER,  # Defines the access level for the skill
    can_be_triggered_by_llm=True,  # Defines whether the skill can be triggered by the LLM
    verbal_descriptions={
        "en": ["find rhymes", "words that rhyme", "rhyme finder"],
        "fr": ["trouver des rimes", "mots qui riment", "rime"],
    },
    parameters=[
        ParameterDescription(name="word", description="The word to find rhymes for."),
        ParameterDescription(name="max_results", description="Maximum number of rhymes to retrieve."),
    ],
)
async def fetch_rhymes(robot: Robot, word: str, max_results: int = 5, timeout: int = 5) -> dict[str, Any]:
    """Fetch words that rhyme with the given input word.

    Args:
        robot (Robot): The robot instance to interact with.
        word (str): The word to find rhymes for.
        max_results (int, optional): The maximum number of rhyming words to retrieve. Defaults to 5.
        timeout (int, optional): The timeout duration for the API request in seconds. Defaults to 5.

    Returns:
        dict: A dictionary containing the response output for the robot to announce.
    """
    try:
        # Fetch rhymes from the Datamuse API
        url = f"https://api.datamuse.com/words?rel_rhy={word}&max={max_results}"
        response = requests.get(url, timeout=timeout)
        response.raise_for_status()
        rhyme_data = response.json()

        if rhyme_data:
            rhymes = [entry["word"] for entry in rhyme_data]
            answer = f"Words that rhyme with '{word}': {', '.join(rhymes)}"
        else:
            answer = f"No rhymes found for '{word}'."
    except requests.RequestException:
        answer = "Couldn't fetch rhymes at this moment."

    await robot.play_animation(BaseEmotion.HAPPY).completed()

    return {"llm_output": answer}  # if you want the LLM to reformulate the answer
```

## Run skill_manager on your computer

- In your simulation virtual environment, run:

```bash
start_skills_manager -i robot_ip -k api_key -d ~/skills
```

By default, this creates a directory at `~/skills` where you can copy your skills; they are detected automatically while the skill manager is running.

```bash
start_skills_manager [-h] [-i IP] [-k API_KEY] [-d UPLOAD_DIR]

options:
  -h, --help            show this help message and exit
  -i IP, --ip IP        Set the IP of the robot you want to connect to.
  -k API_KEY, --api-key API_KEY
                        Set the API key of the robot.
  -d UPLOAD_DIR, --upload-dir UPLOAD_DIR
                        Path to the skill directory.
```

---

## Calling a Custom Skill

Once you've defined a custom skill using the `@skill` decorator, you can trigger it in two main ways:

1. **By voice command**, if you've configured a verbal trigger for the skill.
2. **Programmatically**, using the robot API, just like any other skill provided by `pymirokai`.

### Example

Suppose you've defined a custom skill as follows:

```python
@skill(...)
async def my_custom_skill(robot: Robot, arg_1: str, arg_2: int):
    ...
```

You can execute this skill from your Python code using the `robot._mission()` method.
This gives you full control over its lifecycle, including start, completion, and cancellation:

```python
mission = robot._mission("my_custom_skill", arg_1="arg_1", arg_2=42)
await mission.started()
await mission.completed()
```

This interface returns a `Mission` object that lets you:

* Wait for the skill to start (`started()`)
* Wait for it to finish (`completed()`)
* Cancel it if needed (`cancel()` or `cancel_and_complete()`)

---

## Auto-Start Skills

The `@skill` decorator also supports an `auto_start` flag, which allows a skill to run **automatically** as soon as the skill manager starts.

This is particularly useful for:

* Background monitoring tasks
* State synchronization
* Idle behaviors or default routines

### Defining an Auto-Start Skill

Here's how to create one:

```python
@skill(auto_start=True)
async def background_behavior(robot: Robot):
    ...
```

When the skill manager launches (either locally or on the robot), this skill starts automatically, with no manual call required.

Auto-start skills are ideal for persistent, background processes that should always be active as long as the system is running. An illustrative sketch of such a skill is included at the end of this page.

> 📌 More implementation details and best practices for `auto_start` skills are available in the [Background missions](background_missions.md) guide.

---

These enhancements to the skill system provide a flexible way to design both interactive and background behaviors, making your robot truly autonomous and adaptive.

## Final Notes

- Ensure **all arguments have type annotations**.
- If using `robot`, it must be included as a parameter.
- Properly define **verbal descriptions** and **parameter descriptions** for better integration.
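For reference, here is a minimal sketch of what an auto-start background skill might look like. The periodic greeting, the 60-second interval, and the `Robot` import path are assumptions chosen for illustration; adapt them to your own setup and see the Background missions guide for recommended patterns.

```python
import asyncio

from pymirokai.decorators.skill import skill

# Assumed import path for Robot; adjust to match your pymirokai installation.
from pymirokai.robot import Robot


@skill(auto_start=True)  # starts as soon as the skill manager launches
async def idle_greeter(robot: Robot) -> None:
    """Illustrative background routine: speak a short greeting once a minute."""
    while True:
        await robot.say("Hello! Ask me to find rhymes if you like.")
        await asyncio.sleep(60)  # illustrative interval between greetings
```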