
How to iterate by reusing results of a previous step in current step? #602

sudhindra95 opened this issue Nov 2, 2023 · 9 comments


sudhindra95 commented Nov 2, 2023

Hi community, I have the following JSON input file:
dummy_input.json (note: I want to include N key-value pairs in this file, where N ≈ 1000)

[
  {
    "key": "key1",
    "value": "value1"
  },
  {
    "key": "key2",
    "value": "value2"
  },
  {
    "key": "key3",
    "value": "value3"
  }
]

I have the following Zerocode scenario file:

{
  "scenarioName": "Zerocode Loop Example",
  "steps": [
    {
      "name": "send_request",
      "url": "kafka-topic:my_topic",
      "operation": "produce",
      "request": {
        "recordType": "JSON",
        "records": "${0}"
      },
      "verify": {
        "status": "Ok",
        "recordMetadata": "$NOT.NULL"
      }
    },
    {
      "name": "query_table",
      "url": "com.demo.cassandra.CassandraConnector",
      "method": "getResult",
      "request": {
        "class": "com.demo.core.pojo.MyTable",
        "query": "select * from mytable where value='${$.send_request.request.records[0-n].value}'" // how do I iterate over records 0..n, passing one value at a time?
      },
      "assertions": {
        "resultData": "$NOT.NULL"
      }
    }
  ],
  "parameterized": {
    "csvSource":[
      "${JSON.FILE:tests/input-files/dummy_input.json}"
    ]
  }
}

The result produced by the step send_request is :

{
  "recordType" : "JSON",
  "records" : [ {
    "key" : "key1",
    "value" : "value1"
  }, {
    "key" : "key2",
    "value" : "value2"
  }, {
    "key" : "key3",
    "value" : "value3"
  } ]
} 

I want to pass each value in this result (value1, value2, value3) one at a time to the query used in the query_table step. How can I achieve this?
I kindly request the author and the Zerocode community to assist me with this, as the feature is much needed for testing a large data sample.

@authorjapps (Owner)

@sudhindra95 , see here how Parameterized is used:
https://zerocode-tdd-docs.pages.dev/assertions/Parameterized-Testing-From-CSV-rows#test-scenario

I'm afraid it's supported at the scenario level, not at the step level. But please double-check.

You are welcome to implement it and raise a PR if that's what your use case requires.
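
A minimal sketch of scenario-level parameterization, assuming the `valueSource` form shown in the linked docs (scenario/step names here are illustrative): the whole scenario is re-run once per value, and each step reads the current value via `${0}`.

```json
{
  "scenarioName": "Scenario-level parameterized sketch",
  "steps": [
    {
      "name": "query_table",
      "url": "com.demo.cassandra.CassandraConnector",
      "method": "getResult",
      "request": {
        "class": "com.demo.core.pojo.MyTable",
        "query": "select * from mytable where value='${0}'"
      },
      "assertions": {
        "resultData": "$NOT.NULL"
      }
    }
  ],
  "parameterized": {
    "valueSource": [
      "value1",
      "value2",
      "value3"
    ]
  }
}
```

Note that every step in the scenario is repeated for every value, which is exactly the scenario-level (not step-level) behaviour described above.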

@sudhindra95 (Author)

Hello @authorjapps, I am aware of how Parameterized is used. However, in my case the data is long, and the use case requires me to place the data in a JSON file, as mentioned in the issue.

@nirmalchandra (Collaborator)

@sudhindra95 , I see you have already placed the dummy_input.json content, which is a JSON array, in a file.
What problem are you facing?
What exactly do you want solved for your use case?

@sudhindra95 (Author)

@nirmalchandra , the use case I have is to iterate over the data and send its contents as input to another step, one at a time.
For instance, in my question I have 3 key-value JSON pairs (in the actual use case there are n pairs).

  1. I am producing these inputs to a Kafka topic in the send_request step.
  2. In the second step (query_table), I want to use the value part of each record produced to the Kafka topic in the previous step by sending it as a query parameter.
    As I want to send the values as a query parameter, I need to send them one at a time, which requires iterating over the data.
    @authorjapps mentioned that I can leverage the Parameterized block for iteration. However, in my use case the data is long, and placing it in the Parameterized block is not feasible.
    I need some feature which can support this iteration.
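
As a possible workaround until step-level iteration exists, the entire records array could be handed to the Java step in a single call, with the loop happening inside the connector method itself. This is only a sketch: the method name `getResults` and its list-handling behaviour are assumptions, not an existing Zerocode feature.

```json
{
  "name": "query_table_all",
  "url": "com.demo.cassandra.CassandraConnector",
  "method": "getResults",
  "request": {
    "class": "com.demo.core.pojo.MyTable",
    "values": "${$.send_request.request.records}"
  },
  "assertions": {
    "resultData": "$NOT.NULL"
  }
}
```

The hypothetical `getResults` method would iterate over the list, run one `select ... where value=?` query per record, and return the aggregated results, so a single step covers all n values.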


authorjapps commented Nov 18, 2023

@sudhindra95, still trying to get more clarity about your understanding as well as your requirements.
In the payload below, the KafkaClient is going to put the records onto the topic in a sequential manner, i.e.
assuming you had topic-1 with 10 partitions:
=> key1, value1 ---> partition 0 (example only)
=> key2, value2 ---> partition 9 (example only)
=> key3, value3 ---> partition 5 (example only)

{
  "recordType" : "JSON",
  "records" : [ {
    "key" : "key1",
    "value" : "value1"
  }, {
    "key" : "key2",
    "value" : "value2"
  }, {
    "key" : "key3",
    "value" : "value3"
  } ]
} 

1st part:
This means what you wanted to achieve is already happening.


2nd part:
Please explain what you want to do in the assertions.

Also, please comment if the above understanding of the problem is not correct.

@sudhindra95 (Author)

@authorjapps,

  1. The main use case is: as I produce the JSON records to an input Kafka topic, a Flink job performs some operations and writes some data to a Cassandra table.

  2. I want to send each value in the input, one at a time, to my custom CassandraConnector Java method, which will query the Cassandra table to check whether the value has been written by the Flink job or not.

PS: The second part you are referring to is the structure of the input data I am sending to Kafka; I don't have to do any validation there.
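
Since the Flink job writes to Cassandra asynchronously, the verification step may need to poll rather than assert once. A sketch of the verification step with Zerocode's step-level retry block (the `max`/`delay` values are illustrative, and the query is hard-coded here for a single value):

```json
{
  "name": "query_table",
  "url": "com.demo.cassandra.CassandraConnector",
  "method": "getResult",
  "retry": {
    "max": 10,
    "delay": 2000
  },
  "request": {
    "class": "com.demo.core.pojo.MyTable",
    "query": "select * from mytable where value='value1'"
  },
  "assertions": {
    "resultData": "$NOT.NULL"
  }
}
```

With this, the step is re-attempted up to 10 times at 2-second intervals until the assertion passes, giving the Flink job time to land the row.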

@sudhindra95 (Author)

@authorjapps , could you please provide any updates on this? This is a highly desirable feature for my organization, where we want to test a large number of records.

@authorjapps (Owner)

> @authorjapps , could you please provide any updates on this? This is a very desirable feature for my org where we want to test a large number of records

@sudhindra95 , can you please ping me on Slack around 6pm UK time, or a bit later if possible?
I will try to take some time out to get to the bottom of your scenario.
Hoping that helps!

@sudhindra95 (Author)

@authorjapps , I have pinged you on Slack.
