11-03-2022 12:26 PM
Hello All,
I was wondering if anyone has integrated JumpCloud Directory Insights with Sumo Logic and could share any tips.
There is one way of integrating, and that is using the AWS serverless application.
If anyone has done this before using a different API collector and would like to share their experience, it would be appreciated.
Thank you all in advance.
Regards,
Charmi
11-07-2022 02:47 PM
Hi Charmi,
I have experience with JumpCloud Directory Insights data and have pushed it to a SIEM before. Basically, Directory Insights gives you back a JSON response. You can then push that JSON to your SIEM.
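A minimal sketch of the flow described here, written in Python for illustration (the endpoint, header, and request fields below reflect JumpCloud's public Directory Insights API but should be verified against the current docs; the SIEM side is left as a placeholder):

```python
# Sketch: pull Directory Insights events (JSON) and prepare them for a SIEM.
# Endpoint and header names are assumptions to verify against JumpCloud's docs.
import json
import urllib.request

DI_URL = "https://api.jumpcloud.com/insights/directory/v1/events"

def fetch_events(api_key, start_time, service=("all",)):
    """POST to the Directory Insights events endpoint; returns a list of dicts."""
    body = json.dumps({"service": list(service), "start_time": start_time}).encode()
    req = urllib.request.Request(
        DI_URL,
        data=body,
        headers={"x-api-key": api_key, "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def to_ndjson(events):
    """Many SIEM HTTP sources accept newline-delimited JSON."""
    return "\n".join(json.dumps(e, sort_keys=True) for e in events)

if __name__ == "__main__":
    events = fetch_events("YOUR_API_KEY", "2022-11-01T00:00:00Z")
    payload = to_ndjson(events)
    # POST `payload` to your SIEM's HTTP collector endpoint here.
```

The network call is kept under the main guard so the reshaping step can be adapted independently to whatever format your SIEM expects.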
What scripting language are you comfortable with?
Thanks,
Idan.
11-14-2022 02:23 PM
Hello Idan,
Is there still a way of using a scripting language if we don't want to use an S3 bucket? Can you please let me know?
Thank you,
Charmi Patel
11-15-2022 04:07 AM
Hi Charmi,
If you are familiar with PowerShell, you can use our PowerShell SDK to retrieve Directory Insights data (it's JSON), and then push it to any third-party tool you would like, using its format. Either one-by-one, as an array, in batches, etc.
Basically, your script should get the data and then push it to the other side, while keeping some sort of state so you don't duplicate or lose information.
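The state-keeping idea mentioned above could be sketched like this (in Python rather than PowerShell, purely for illustration; the checkpoint file name and the `timestamp` field are assumptions):

```python
# Sketch of "keep some sort of state": remember the newest event timestamp
# already shipped, and skip anything at or before it, so repeated polls
# neither duplicate nor drop events.
import json
import os

STATE_FILE = "di_checkpoint.json"  # illustrative name

def load_checkpoint(path=STATE_FILE):
    """Return the last shipped timestamp, or '' on first run."""
    if os.path.exists(path):
        with open(path) as fh:
            return json.load(fh).get("last_timestamp", "")
    return ""

def save_checkpoint(ts, path=STATE_FILE):
    with open(path, "w") as fh:
        json.dump({"last_timestamp": ts}, fh)

def new_events(events, last_ts):
    """Keep only events strictly newer than the checkpoint.
    Assumes each event carries an ISO-8601 'timestamp' field, which
    sorts correctly as a string. Events sharing the exact checkpoint
    timestamp would need event-id tracking as well."""
    return [e for e in events if e["timestamp"] > last_ts]
```

After a successful push, save the newest timestamp from the shipped batch so the next poll resumes from there.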
Useful Links:
Hope this helps to inspire you on how to build this.
Cheers,
Idan.
11-07-2022 07:18 PM
Hi Charmi,
To add to Idan's points, it seems Sumo Logic is able to source data from an S3 bucket.
So the whole thing might look like this:
once you set up our serverless app on AWS and the data (in JSON) is being captured, you can set up the data source (the S3 bucket) in Sumo Logic to start the event streaming.
Let us know if that is the case and feel free to continue the conversation here.
11-08-2022 03:13 PM
Hello Idan and Shawn,
Thank you for the response.
I do know the steps for creating an AWS serverless app for JumpCloud logs. Once the app is created, it creates an S3 bucket and AWS Lambda functions, with the logs captured as JSON files.
It is quite confusing what the process is for integrating those logs into Sumo Logic. Do I have to set up the AWS Lambda or the S3 bucket in Sumo Logic?
Thank you
Charmi
11-08-2022 06:42 PM
You are most welcome, Charmi.
So since the S3 bucket already has the events as data (can you please double-check?), the next thing is letting Sumo Logic hook up with that exact bucket to have access to the data. You can find the steps at the link I mentioned above; I will paste it here again.
https://help.sumologic.com/docs/send-data/hosted-collectors/amazon-aws/aws-s3-source/
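The "double-check" step above could be done from the AWS console, or with a short script like this sketch (Python with boto3; the bucket name is a placeholder and local AWS credentials are assumed):

```python
# Sketch: confirm the Directory Insights serverless app is actually writing
# event objects into the S3 bucket before wiring up the Sumo Logic S3 source.
import json

def summarize_objects(keys):
    """Group object keys by their top-level prefix, so you can see at a
    glance whether event files are landing in the bucket."""
    counts = {}
    for key in keys:
        prefix = key.split("/", 1)[0]
        counts[prefix] = counts.get(prefix, 0) + 1
    return counts

if __name__ == "__main__":
    import boto3  # assumption: AWS credentials are configured locally
    s3 = boto3.client("s3")
    resp = s3.list_objects_v2(Bucket="your-jumpcloud-di-bucket", MaxKeys=50)
    keys = [obj["Key"] for obj in resp.get("Contents", [])]
    print(json.dumps(summarize_objects(keys), indent=2))
```

If the summary comes back empty, fix the serverless app side first; the Sumo Logic S3 source can only stream what is already in the bucket.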
11-09-2022 09:25 AM
Thank you Shawn,
I wanted to confirm the steps once the S3 bucket and Lambda are set up. I will look at the documentation. Thank you for the help.
11-09-2022 06:01 PM
No worries Charmi, 🙏
03-08-2023 09:42 PM
Hey Charmi,
I have tried Sumo Logic myself and documented the steps here; thought it might be useful for you.