Merged
2 changes: 1 addition & 1 deletion .github/ISSUE_TEMPLATE/bug_report.md
Original file line number Diff line number Diff line change
@@ -21,7 +21,7 @@ If applicable, add screenshots to help explain your problem.

**Desktop (please complete the following information):**
- OS: [e.g. iOS]
- Browser [e.g. chrome, safari]
- Browser [e.g. Chrome, Safari]
- Version [e.g. 22]

**Additional context**
4 changes: 2 additions & 2 deletions README.md
Expand Up @@ -98,7 +98,7 @@ export WATSONX_APIKEY="your-api-key"
export WATSONX_PROJECT_ID="your-project-id"
```

If you use Replicate:
If you use [Replicate](https://replicate.com/):
```bash
export REPLICATE_API_TOKEN="your-token"
```
@@ -123,7 +123,7 @@ VSCode setup for syntax highlighting and validation:

### Variable Definition & Template Usage

In this example we use external content imput.yaml and watonsx as a LLM provider.
In this example we use external content from _data.yaml_ and watsonx as the LLM provider.

```yaml
description: Template with variables
6 changes: 3 additions & 3 deletions docs/README.md
Expand Up @@ -53,8 +53,8 @@ In order to run these examples, you need to create a free account
on Replicate, get an API key and store it in the environment variable:
- `REPLICATE_API_TOKEN`

In order to use foundation models hosted on [Watsonx](https://www.ibm.com/watsonx) via LiteLLM, you need a WatsonX account (a free plan is available) and set up the following environment variables:
- `WATSONX_URL`, the API url (set to `https://{region}.ml.cloud.ibm.com`) of your WatsonX instance. The region can be found by clicking in the upper right corner of the Watsonx dashboard (for example a valid region is `us-south` ot `eu-gb`).
In order to use foundation models hosted on [watsonx](https://www.ibm.com/watsonx) via LiteLLM, you need a watsonx account (a free plan is available) and set up the following environment variables:
- `WATSONX_URL`, the API URL (set to `https://{region}.ml.cloud.ibm.com`) of your watsonx instance. The region can be found by clicking in the upper right corner of the watsonx dashboard (for example, a valid region is `us-south` or `eu-gb`).
- `WATSONX_APIKEY`, the API key (see information on [key creation](https://cloud.ibm.com/docs/account?topic=account-userapikey&interface=ui#create_user_key))
- `WATSONX_PROJECT_ID`, the project hosting the resources (see information about [project creation](https://www.ibm.com/docs/en/watsonx/saas?topic=projects-creating-project) and [finding project ID](https://dataplatform.cloud.ibm.com/docs/content/wsj/analyze-data/fm-project-id.html?context=wx)).

@@ -133,7 +133,7 @@ text:
temperature: 0
```

The `description` field is a description for the program. Field `text` contains a list of either strings or *block*s which together form the text to be produced. In this example, the text starts with the string `"Hello\n"` followed by a block that calls out to a model. In this case, it is model with id `replicate/ibm-granite/granite-20b-code-instruct-8k` on Replicate, via LiteLLM, with the indicated parameters: the stop sequence is `!`, and temperature set to `0`. Stop sequences are provided with a comman separated list of strings. The input to the model call is everything that has been produced so far in the program (here `"Hello\n"`).
The `description` field is a description for the program. Field `text` contains a list of either strings or *block*s which together form the text to be produced. In this example, the text starts with the string `"Hello\n"` followed by a block that calls out to a model. In this case, it is the model with id `replicate/ibm-granite/granite-20b-code-instruct-8k` on Replicate, via LiteLLM, with the indicated parameters: the stop sequence is `!` and the temperature is set to `0`. Stop sequences are provided as a comma-separated list of strings. The input to the model call is everything that has been produced so far in the program (here `"Hello\n"`).
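For reference, the program described in this paragraph can be sketched roughly as follows. This is a sketch, not the exact example file from the repository; field names such as `parameters` and `stop_sequences` follow the conventions used elsewhere in these docs and may differ across PDL versions:

```yaml
description: Hello world calling a model
text:
- "Hello\n"
- model: replicate/ibm-granite/granite-20b-code-instruct-8k
  parameters:
    # Comma-separated list of stop strings; here a single sequence.
    stop_sequences: "!"
    temperature: 0
```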

When we execute this program using the PDL interpreter:

24 changes: 7 additions & 17 deletions docs/tutorial.md
@@ -119,7 +119,7 @@ In PDL, we can declaratively chain models together as in the following example (
--8<-- "./examples/tutorial/model_chaining.pdl"
```

In this program, the first call is to a granite model to complete the sentence `Hello, world!`. The following block in the document prints out the sentence: `Translate this to French`. The final line of the program takes the entire document produced so far and passes it as input to the granite multilingual model. Notice that the input passed to this model is the document up to that point, represented as a conversation. This makes it easy to chain models together and continue building on previous interactions.
In this program, the first call is to a Granite model to complete the sentence `Hello, world!`. The following block in the document prints out the sentence: `Translate this to French`. The final line of the program takes the entire document produced so far and passes it as input to the Granite multilingual model. Notice that the input passed to this model is the document up to that point, represented as a conversation. This makes it easy to chain models together and continue building on previous interactions.
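A minimal sketch of such a chain is shown below (the actual example file is included in the docs via a snippet marker; the second model id here is hypothetical and stands in for whatever multilingual Granite model the example uses):

```yaml
description: Model chaining sketch
text:
- "Hello, world!\n"
- model: replicate/ibm-granite/granite-20b-code-instruct-8k
- "\nTranslate this to French\n"
# Hypothetical multilingual model id; the input is the whole document so far.
- model: replicate/ibm-granite/granite-20b-multilingual
```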

When we execute this program, we obtain:

@@ -151,7 +151,7 @@ The translation of 'I love Paris!' to French is 'J'aime Paris!'.
The translation of 'I love Madrid!' to Spanish is 'Me encanta Madrid!'.
```

A function only contributes to the output document when it is called. So the definition itself results in `""`. When we call a function, we implicitly pass the current background context, and this is used as input to model calls inside the function body. In the above example, since the `input` field is omitted, the entire document produced at that point is passed as input to the granite model.
A function only contributes to the output document when it is called. So the definition itself results in `""`. When we call a function, we implicitly pass the current background context, and this is used as input to model calls inside the function body. In the above example, since the `input` field is omitted, the entire document produced at that point is passed as input to the Granite model.

To reset the context when calling a function, we can pass the special argument: `pdl_context: []`.
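A sketch of a function definition and a call that resets the context is given below. It assumes functions are declared under `defs` with `function` and `return` fields, and called with `call`/`args`, matching the style of the tutorial's translation example; the exact field names may vary by PDL version:

```yaml
defs:
  translate:
    function:
      sentence: str
      language: str
    return:
      text:
      - model: replicate/ibm-granite/granite-20b-code-instruct-8k
        input: "Translate '${ sentence }' to ${ language }."
text:
- call: translate
  args:
    # Special argument: start the function with an empty context.
    pdl_context: []
    sentence: "I love Madrid!"
    language: Spanish
```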

@@ -223,12 +223,12 @@ An `object` constructs an object:
```
object:
name: Bob
job: mananger
job: manager
```

This results in the following output:
```
{"name": "Bob", "job": "mananger"}
{"name": "Bob", "job": "manager"}
```

Each value in the object can be any PDL block, and the result is presented as an object.
@@ -328,7 +328,7 @@ PDL programs can contain calls to REST APIs with Python code. Consider a simple
--8<-- "./examples/tutorial/calling_apis.pdl"
```

In this program, we first define a query about the weather in some location (assigned to variable `QUERY`). The next block is a call to a granite model with few-shot examples to extract the location, which we assign to variable `LOCATION`. The next block makes an API call with Python (mocked in this example). Here the `LOCATION` is appended to the `url`. The result is a JSON object, which may be hard to interpret for a human user. So we make a final call to an LLM to interpret the JSON in terms of weather. Notice that many blocks have `contribute` set to `[]` to hide intermediate results.
In this program, we first define a query about the weather in some location (assigned to variable `QUERY`). The next block is a call to a Granite model with few-shot examples to extract the location, which we assign to variable `LOCATION`. The next block makes an API call with Python (mocked in this example). Here the `LOCATION` is appended to the `url`. The result is a JSON object, which may be hard to interpret for a human user. So we make a final call to an LLM to interpret the JSON in terms of weather. Notice that many blocks have `contribute` set to `[]` to hide intermediate results.
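The shape of such an API-calling code block might be sketched as follows. The endpoint URL here is hypothetical (the repository's weather example mocks the call); in PDL code blocks the `result` variable holds the block's result:

```yaml
- def: WEATHER
  contribute: []
  lang: python
  code: |
    import requests
    # Hypothetical endpoint; LOCATION is appended to the url as a query parameter.
    result = requests.get('https://api.example.com/current?q=${ LOCATION }').text
```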


## Data Block
@@ -538,7 +538,7 @@ role: user
```

In PDL, any block can be adorned with a `role` field indicating the role for that block. These are high-level annotations
that help to make programs more portable accross different models. If the role of a block is not specified (except for model blocks that have `assistant` role),
that help to make programs more portable across different models. If the role of a block is not specified (except for model blocks that have `assistant` role),
then the role is inherited from the surrounding block. So in the above example, we only need to specify `role: user` at the top level (this is the default, so it doesn't
need to be specified explicitly).
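For instance, the following sketch annotates the whole program with `role: user`; the string block inherits that role, while the model block's output keeps the default `assistant` role (the model id is only illustrative):

```yaml
description: Role annotation sketch
role: user
text:
- "What is the capital of France? Answer in one word.\n"
- model: replicate/ibm-granite/granite-20b-code-instruct-8k
```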

@@ -607,7 +607,7 @@ the examples below:
- `{list: {int: {minimum: 0}}}`: a list of integers satisfying the indicated constraints
- `[{int: {minimum: 0}}]`: same as above
- `{list: {minItems: 1, int: {}}}`: a list satisfying the indicated constraints
- `{obj: {latitude: float, longitude: float}}`: an ibject with fields `latitude` and `longitude`
- `{obj: {latitude: float, longitude: float}}`: an object with fields `latitude` and `longitude`
- `{latitude: float, longitude: float}`: same as above
- `{obj: {question: str, answer: str, context: {optional: str}}}`: an object with an optional field
- `{question: str, answer: str, context: {optional: str}}`: same as above
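Assuming these type expressions are attached to a block via a `spec` field, as elsewhere in the tutorial, a model call expected to return a parsed coordinate object might be sketched as:

```yaml
- model: replicate/ibm-granite/granite-20b-code-instruct-8k
  # The model's output is checked against this type specification.
  spec: {latitude: float, longitude: float}
  parameters:
    temperature: 0
```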
@@ -931,13 +931,3 @@ Output:
Several lines of text, with some "quotes" of various 'types'. Escapes (like \n) don't do anything.
Newlines can be added by leaving a blank line. Additional leading whitespace is ignored.
```
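The behavior shown in the output above matches YAML's folded block scalar (`>`): single newlines fold into spaces, escapes stay literal, and a blank line produces a real newline. A sketch of the corresponding input:

```yaml
text:
- >-
  Several lines of text,
  with some "quotes" of various 'types'.
  Escapes (like \n) don't do anything.

  Newlines can be added by leaving a blank line.
```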










2 changes: 1 addition & 1 deletion examples/cldk/cldk-assistant.pdl
@@ -14,7 +14,7 @@ text:
from cldk import CLDK
from cldk.models.java.models import *

# Initialize the Codellm-DevKit object with the project directory, langguage, and backend.
# Initialize the Codellm-DevKit object with the project directory, language, and backend.
cldk = CLDK("java")
cldk_state = cldk.analysis(
project_path="${ project }", # Change this to the path of the project you want to analyze.
3 changes: 1 addition & 2 deletions examples/weather/weather.pdl
@@ -19,7 +19,7 @@ text:
- lang: python
code: |
import requests
#result = requests.get('https://api.weatherapi.com/v1/current.json?key==XYZ=${ LOCATION }')
#result = requests.get('https://api.weatherapi.com/v1/current.json?key=XYZ&q=${ LOCATION }').text
#Mock result:
result = '{"location": {"name": "Madrid", "region": "Madrid", "country": "Spain", "lat": 40.4, "lon": -3.6833, "tz_id": "Europe/Madrid", "localtime_epoch": 1732543839, "localtime": "2024-11-25 15:10"}, "current": {"last_updated_epoch": 1732543200, "last_updated": "2024-11-25 15:00", "temp_c": 14.4, "temp_f": 57.9, "is_day": 1, "condition": {"text": "Partly cloudy", "icon": "//cdn.weatherapi.com/weather/64x64/day/116.png", "code": 1003}, "wind_mph": 13.2, "wind_kph": 21.2, "wind_degree": 265, "wind_dir": "W", "pressure_mb": 1017.0, "pressure_in": 30.03, "precip_mm": 0.01, "precip_in": 0.0, "humidity": 77, "cloud": 75, "feelslike_c": 12.8, "feelslike_f": 55.1, "windchill_c": 13.0, "windchill_f": 55.4, "heatindex_c": 14.5, "heatindex_f": 58.2, "dewpoint_c": 7.3, "dewpoint_f": 45.2, "vis_km": 10.0, "vis_miles": 6.0, "uv": 1.4, "gust_mph": 15.2, "gust_kph": 24.4}}'
def: WEATHER
@@ -29,4 +29,3 @@ text:
input: |
Explain the weather from the following JSON:
${ WEATHER }
