This is Tom from Coherent Logic, and in this demo I will show you how to acquire OpenFIGI data in Spark using the CMR API. The CMR API extends the OpenFIGI Client so that the resultant JSON can be converted directly into a Spark dataset.

On the right-hand side I will quickly go over a Java example. In this example we have a query builder, and we can see in the comments above where it needs to send data: it is going to POST to this URL, it has a Content-Type header set to text/json, and its body contains an ID type, an ID value, and an exchange code. In our example the query builder has a request body, a new mapping entry with an ID type and an ID value, and an exchange code set. Once we've done that, we can execute a method called doGetAsData, and that will return the Data to us. The URL being posted to and the Content-Type header are set behind the scenes. I've already executed this example, and we can take a look at the resultant data here.

Now let's see what this looks like in Spark. I will enter paste mode. In our example you can see that we've added some imports, we have a new instance of CMR, and then we use CMR to get an instance of the query builder, which has been decorated with a new method called doGetAsDataDataset, and we pass in an instance of the Spark session. Once this returns we'll have an instance of Data, and we can take a look inside it. I'll expand this for us, and that's what has been returned, which is the same as the example we saw previously.

This concludes our demonstration. If you liked this video, please give it a thumbs up, and feel free to leave comments and questions; I will get you a response as soon as I can. Bye.
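
For reference, here is a minimal Java sketch of the raw request described above, built with the JDK's HttpClient rather than the CMR query builder. The endpoint URL is an assumption (the video only points at "this URL" on screen), and the ID type, ID value, and exchange code are hypothetical placeholder values; the Content-Type of text/json follows the narration.

// Minimal sketch of the POST the query builder assembles behind the scenes.
// The mapping URL is assumed; the header and body fields follow the demo's
// narration: Content-Type text/json, plus an ID type, ID value, and exchange code.
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class OpenFigiMappingSketch {
    public static void main(String[] args) throws Exception {
        String mappingUrl = "https://api.openfigi.com/v1/mapping"; // assumed endpoint
        String body = "[{\"idType\":\"ID_ISIN\","          // hypothetical ID type
                    + "\"idValue\":\"US4592001014\","      // hypothetical ID value
                    + "\"exchCode\":\"US\"}]";             // hypothetical exchange code

        HttpRequest request = HttpRequest.newBuilder(URI.create(mappingUrl))
            .header("Content-Type", "text/json")
            .POST(HttpRequest.BodyPublishers.ofString(body))
            .build();

        // Send the request and print the JSON mapping response.
        HttpResponse<String> response = HttpClient.newHttpClient()
            .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.body());
    }
}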
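
And here is a sketch of the Spark step. The video runs the equivalent code in spark-shell paste mode; this version uses Spark's documented Java API for the session setup only. The CMR and query-builder class names and the exact builder chain are assumptions reconstructed from the narration (only doGetAsDataDataset and the fields it sets are named in the demo), so that part is shown as commented pseudo-calls rather than as the library's real API.

// Sketch of obtaining the OpenFIGI result as a Spark dataset via the CMR API.
// Only the SparkSession setup below uses Spark's real Java API; the CMR calls
// in the comment block are an assumed shape based on the demo's narration.
import org.apache.spark.sql.SparkSession;

public class CmrOpenFigiSparkSketch {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
            .appName("openfigi-cmr-demo")
            .master("local[*]")
            .getOrCreate();

        // Hypothetical CMR usage, mirroring the steps narrated in the demo:
        // CMR cmr = new CMR();
        // Dataset<Row> figiData = cmr
        //     .getQueryBuilder()            // assumed accessor name
        //     .withRequestBody()            // request body, as described
        //     .withNewMappingEntry()        // new mapping entry
        //     .withIdType("ID_ISIN")        // hypothetical ID type
        //     .withIdValue("US4592001014")  // hypothetical ID value
        //     .withExchCode("US")           // hypothetical exchange code
        //     .doGetAsDataDataset(spark);   // method named in the demo
        // figiData.show(false);

        spark.stop();
    }
}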