mongodb - Performance issue with JSON Input
I am loading a MySQL table from a MongoDB source through Kettle. The MongoDB collection has more than 4 million records, and when I run the Kettle job it takes 17 hours to finish the first-time load. Even an incremental load takes more than an hour. I tried increasing the commit size and giving the job more memory, but performance is still not improving. I think the JSON Input step takes a long time to parse the data and is hence slow. I have these steps in the transformation:
- MongoDB Input step
- JSON Input
- Strings cut
- If field value is null
- Concat Fields
- Select values
- Table output
The same 4 million records are extracted way faster from Postgres than from MongoDB. Is there any way I can improve the performance? Please help me.
Thanks, Deepthi
Run multiple copies of the step. It sounds like you have a MongoDB Input step feeding a JSON Input step that parses the JSON results, right? Use 4 or 8 copies of the JSON Input step (or more, depending on your CPUs) and it'll speed up.
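To see why this helps, here is a minimal standalone Java sketch (not Kettle code; it assumes Jackson is on the classpath, and the class name and sample documents are made up for illustration). JSON parsing is CPU-bound, so spreading it across a pool of workers scales with the number of cores, which is the same effect you get from running N copies of the JSON Input step:

```java
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class ParallelParseDemo {
    public static void main(String[] args) throws Exception {
        // One worker per core, analogous to setting "number of copies"
        // on the JSON Input step to 4 or 8.
        int workers = Runtime.getRuntime().availableProcessors();
        ExecutorService pool = Executors.newFixedThreadPool(workers);
        ObjectMapper mapper = new ObjectMapper(); // thread-safe once configured

        // Stand-in for the documents coming out of the MongoDB Input step.
        List<String> docs = List.of(
                "{\"_id\":1,\"name\":\"a\"}",
                "{\"_id\":2,\"name\":\"b\"}");

        // Submit each document to the pool so parsing runs in parallel.
        List<Future<JsonNode>> results = docs.stream()
                .map(json -> pool.submit(() -> mapper.readTree(json)))
                .toList();

        for (Future<JsonNode> f : results) {
            System.out.println(f.get().get("name").asText());
        }
        pool.shutdown();
    }
}
```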
Alternatively, if you don't need to parse the full JSON, maybe you can extract the data via a regex or something.
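If you only need one or two fields out of each document, a regex against the raw JSON string is far cheaper than a full parse. A rough sketch of the idea (the field name "name" is purely illustrative, and this only works on flat documents with predictable formatting; it will break on nested objects or escaped quotes):

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class RegexExtractDemo {
    // Matches "name":"<value>" in a flat JSON string; not a general JSON parser.
    private static final Pattern NAME =
            Pattern.compile("\"name\"\\s*:\\s*\"([^\"]*)\"");

    public static void main(String[] args) {
        String doc = "{\"_id\":1,\"name\":\"deepthi\",\"city\":\"x\"}";
        Matcher m = NAME.matcher(doc);
        if (m.find()) {
            System.out.println(m.group(1)); // prints: deepthi
        }
    }
}
```

Inside Kettle itself, the Regex Evaluation step can apply a pattern like this directly to the string field coming out of the MongoDB Input step, replacing the JSON Input step entirely for simple extractions.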