Pipeline: Summarize GitHub Issue with OpenAI
Summarize a GitHub issue with OpenAI and send the summary to a Slack channel.
Run the pipeline
To run this pipeline from your terminal:
```shell
flowpipe pipeline run summarize_github_issue_with_openai.pipeline.summarize_github_issue_with_openai \
  --arg 'github_repository_owner=<string>' \
  --arg 'github_repository_name=<string>' \
  --arg 'github_issue_number=<number>' \
  --arg 'openai_system_content=<string>' \
  --arg 'openai_model=<string>' \
  --arg 'openai_max_tokens=<number>' \
  --arg 'openai_temperature=<number>' \
  --arg 'slack_channel=<string>'
```
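For example, a concrete invocation might look like the following. All argument values here are hypothetical placeholders; substitute your own repository, issue number, model, and Slack channel.

```shell
# Illustrative run with hypothetical values -- adjust for your own repo,
# issue, OpenAI model, and Slack channel.
flowpipe pipeline run summarize_github_issue_with_openai.pipeline.summarize_github_issue_with_openai \
  --arg 'github_repository_owner=turbot' \
  --arg 'github_repository_name=flowpipe' \
  --arg 'github_issue_number=123' \
  --arg 'openai_system_content=You are a helpful assistant that summarizes GitHub issues concisely.' \
  --arg 'openai_model=gpt-4o-mini' \
  --arg 'openai_max_tokens=256' \
  --arg 'openai_temperature=0.2' \
  --arg 'slack_channel=#engineering'
```

A lower `openai_temperature` (e.g. 0.2) keeps the summary focused and deterministic, which is usually what you want for issue triage.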
Use this pipeline
To call this pipeline from another pipeline, use a pipeline step:
```hcl
step "pipeline" "step_name" {
  pipeline = summarize_github_issue_with_openai.pipeline.summarize_github_issue_with_openai
  args = {
    github_repository_owner = <string>
    github_repository_name  = <string>
    github_issue_number     = <number>
    openai_system_content   = <string>
    openai_model            = <string>
    openai_max_tokens       = <number>
    openai_temperature      = <number>
    slack_channel           = <string>
  }
}
```
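As a sketch, a hypothetical parent pipeline could wire the step up with concrete values and surface the result via the `openai_response` output. The pipeline name, step name, and argument values below are illustrative, not part of this sample:

```hcl
# Hypothetical parent pipeline -- names and values are examples only.
pipeline "notify_issue_summary" {
  step "pipeline" "summarize" {
    pipeline = summarize_github_issue_with_openai.pipeline.summarize_github_issue_with_openai
    args = {
      github_repository_owner = "turbot"
      github_repository_name  = "flowpipe"
      github_issue_number     = 123
      openai_system_content   = "You are a helpful assistant that summarizes GitHub issues concisely."
      openai_model            = "gpt-4o-mini"
      openai_max_tokens       = 256
      openai_temperature      = 0.2
      slack_channel           = "#engineering"
    }
  }

  # Expose the child pipeline's summary to callers of this pipeline.
  output "summary" {
    value = step.pipeline.summarize.output.openai_response
  }
}
```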
Params
| Name | Type | Required | Description | Default |
|---|---|---|---|---|
| github_conn | connection.github | No | Name of the GitHub connection to use. If not provided, the default GitHub connection will be used. | connection.github.default |
| slack_conn | connection.slack | No | Name of the Slack connection to use. If not provided, the default Slack connection will be used. | connection.slack.default |
| openai_conn | connection.openai | No | Name of the OpenAI connection to use. If not provided, the default OpenAI connection will be used. | connection.openai.default |
| github_repository_owner | string | Yes | The organization or user name that owns the repository. | - |
| github_repository_name | string | Yes | The name of the repository. | - |
| github_issue_number | number | Yes | The number of the issue to summarize. | - |
| openai_system_content | string | Yes | Content of the system message, which sets the behavior of the assistant. | - |
| openai_model | string | Yes | ID of the model to use. See the [model endpoint compatibility](https://platform.openai.com/docs/models/model-endpoint-compatibility) table for details on which models work with the Chat API. | - |
| openai_max_tokens | number | Yes | The maximum number of tokens to generate in the chat completion. | - |
| openai_temperature | number | Yes | What sampling temperature to use, between 0 and 2. Higher values like 0.8 will make the output more random, while lower values like 0.2 will make it more focused and deterministic. | - |
| slack_channel | string | Yes | Channel, private group, or IM channel to send the message to. | - |
Outputs
| Name | Description |
|---|---|
| openai_response | The chat completion response returned by OpenAI. |