On July 8 I tried remapping 'Unused Offer' to 'Accepted' in `previous_application.csv`, but saw zero improvement in local CV. I also experimented with performing aggregations based only on the Unused Offers and Canceled Offers, but saw no increase in local CV.
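The remapping experiment can be sketched with pandas. The tiny frame below is a toy stand-in for `previous_application.csv`, and the status labels follow the wording above (the raw dataset's labels may differ slightly):

```python
import pandas as pd

# Toy stand-in for previous_application.csv; only the contract-status
# column matters for this experiment.
prev = pd.DataFrame({
    "SK_ID_PREV": [1, 2, 3, 4],
    "NAME_CONTRACT_STATUS": ["Accepted", "Unused Offer", "Canceled", "Refused"],
})

# Remap 'Unused Offer' to 'Accepted' before building aggregations, on the
# theory that a client with an unused offer was still an accepted client.
prev["NAME_CONTRACT_STATUS"] = prev["NAME_CONTRACT_STATUS"].replace(
    {"Unused Offer": "Accepted"}
)

counts = prev["NAME_CONTRACT_STATUS"].value_counts()
```

After the replace, the aggregations are rebuilt on the merged categories; as noted above, this did not move local CV.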

I also built trend features over the monthly time series (ATM withdrawals, installments) to see if the client was increasing ATM withdrawals as time went on, or if the client was decreasing their minimum payment as time went on, etc.
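These time trends can be sketched with pandas `groupby().diff()`: the first difference acts as a velocity and the second difference as an acceleration. The frame is a toy example; the column names echo the Home Credit `credit_card_balance.csv` schema (`AMT_DRAWINGS_ATM_CURRENT` is ATM withdrawals):

```python
import pandas as pd

# Toy monthly balance history for two clients.
cc = pd.DataFrame({
    "SK_ID_CURR": [1, 1, 1, 2, 2, 2],
    "MONTHS_BALANCE": [-3, -2, -1, -3, -2, -1],
    "AMT_DRAWINGS_ATM_CURRENT": [100.0, 150.0, 250.0, 80.0, 80.0, 60.0],
})

cc = cc.sort_values(["SK_ID_CURR", "MONTHS_BALANCE"])

# Velocity: month-over-month change, computed within each client.
cc["atm_velocity"] = cc.groupby("SK_ID_CURR")["AMT_DRAWINGS_ATM_CURRENT"].diff()
# Acceleration: change of the change, again within each client.
cc["atm_acceleration"] = cc.groupby("SK_ID_CURR")["atm_velocity"].diff()

# One aggregate per client: mean velocity as a trend signal.
trend = cc.groupby("SK_ID_CURR")["atm_velocity"].mean()
```

Client 1's withdrawals are ramping up (positive mean velocity) while client 2's are flat-to-falling, which is exactly the kind of signal these features were meant to capture.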

I was hitting a wall. On July 13, I lowered my learning rate to 0.005, and my local CV went to 0.7967. The public LB was 0.797, and the private LB was 0.795. This was the highest local CV I was able to get with a single model.

For the next model, I spent a great deal of time trying to adjust the hyperparameters here and there. I tried lowering the learning rate, choosing the top 700 or 400 features, training with `boosting_type=dart`, dropping certain columns, and replacing certain values with NaN. My score never improved. I also tested 2, 3, 4, 5, 6, 7, and 8 year aggregations, but none helped.

On July 18 I created a new dataset with more features to try to improve my score. You can find it by clicking here, and the code to generate it by clicking here.

On July 20 I took the average of two models that were trained on different date lengths for aggregations and got public LB 0.801 and private LB 0.796. I did some more blends after this, and several scored higher on the private LB, but none ever beat that public LB. I also tried Genetic Programming features, target encoding, and changing hyperparameters, but nothing helped. I tried using the built-in `lightgbm.cv` to re-train on the full dataset, and that didn't help either. I tried raising the regularization, since I thought I had too many features, but it didn't help. I tried tuning `scale_pos_weight` and found it didn't help; in fact, sometimes increasing the weight of non-positive examples would increase the local CV more than increasing the weight of positive examples (counter-intuitive)!
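The blend of two models can be as simple as a mean of their predicted probabilities; a rank average is a common variant that is robust when the models are calibrated differently. The prediction vectors below are toy numbers, not the actual submissions:

```python
import numpy as np
import pandas as pd

# Predictions from two models trained with different aggregation windows.
p1 = np.array([0.10, 0.40, 0.80, 0.25])
p2 = np.array([0.05, 0.55, 0.70, 0.30])

# Simple mean blend of the raw probabilities.
blend_mean = (p1 + p2) / 2

# Rank-average blend: convert each model's scores to percentile ranks
# first, then average. Only the ordering of each model matters, which
# suits a ranking metric like AUC.
r1 = pd.Series(p1).rank(pct=True)
r2 = pd.Series(p2).rank(pct=True)
blend_rank = ((r1 + r2) / 2).to_numpy()
```

Since the competition metric is AUC, only the ordering of the blended scores matters, which is why rank averaging is a reasonable alternative to averaging raw probabilities.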

I also treated Cash Loans and Consumer Loans as the same, which let me eliminate a lot of the high cardinality.

While this was going on, I was fooling around a lot with Neural Networks, since I had plans to add one as a blend to my model to see if my score improved. I'm glad I did, since I included some neural networks in my ensemble later. I have to thank Andy Harless for encouraging everyone in the competition to develop Neural Networks, and for his very easy-to-follow kernel that inspired me to say, "Hey, I can do this too!" He only used a feed-forward neural network, but I had plans to use an entity-embedding neural network with a different normalization scheme.
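The core idea of an entity-embedding network is to replace one-hot categorical inputs with small learned dense vectors that are concatenated with the numeric features. A minimal NumPy forward pass, with all dimensions and names invented for illustration (in a real model the parameters are learned by backprop, not sampled):

```python
import numpy as np

rng = np.random.default_rng(42)

n_categories = 10   # e.g. distinct occupation types
emb_dim = 4         # width of the learned embedding
n_numeric = 3       # numeric features fed in alongside

# "Trainable" parameters, randomly initialized for the sketch.
embedding = rng.normal(size=(n_categories, emb_dim))
W = rng.normal(size=(emb_dim + n_numeric, 1))
b = np.zeros(1)

def forward(cat_idx, numeric):
    """Look up the category's embedding, concatenate the numeric
    features, and apply one dense layer with a sigmoid output."""
    x = np.concatenate([embedding[cat_idx], numeric])
    logit = x @ W + b
    return 1.0 / (1.0 + np.exp(-logit))

p = forward(3, np.array([0.5, -1.2, 0.7]))
```

Compared with one-hot encoding, the embedding table keeps high-cardinality categoricals compact and lets similar categories end up with similar vectors.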

My highest private LB score working alone was 0.79676. This would have earned me rank #247, good enough for a silver medal but still very respectable.

On August 13 I created a new updated dataset that had a ton of new features that I was hoping would take me even higher. The dataset can be found by clicking here, and the code to generate it can be found by clicking here.

The new featureset had features that I thought were quite unique. It had categorical cardinality reduction, conversion of ordered categories to numerics, cosine/sine transformation of the hour of application (so 0 is close to 23), the ratio between the stated income and the average income for your occupation (if your stated income is much larger, you are probably lying to make your application look better!), and income divided by the total area of the home. I took the sum of the `AMT_ANNUITY` you pay out each month on active previous applications, and then divided that by your income, to see if the ratio was large enough to take on a new loan. I took velocities and accelerations of certain columns (e.g. ATM withdrawals); this could reveal if the client was beginning to get short on money and was likely to default. I also looked at velocities and accelerations of days past due and amounts overpaid/underpaid to see if there were recent trends.

Unlike others, I thought the `bureau_balance` table was very useful. I re-mapped the `STATUS` column to numeric and erased all the `C` rows (they contained no additional information; they were just spammy rows), and from this I was able to figure out which bureau applications were active, which were defaulted on, etc. It also helped with cardinality reduction. It was only getting local CV of 0.794 though, so maybe I left out too much information. If I had more time, I would not have reduced cardinality so much and would have just kept the other useful features I created. However, it probably helped a lot with the diversity of the team stack.
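Two of the features above are easy to sketch: the sine/cosine encoding of the application hour (so hour 23 sits next to hour 0 on a circle) and the active-annuity-to-income ratio. Column names mirror the Home Credit `application_train.csv` schema, but all values are made up:

```python
import numpy as np
import pandas as pd

apps = pd.DataFrame({
    "SK_ID_CURR": [1, 2, 3],
    "HOUR_APPR_PROCESS_START": [0, 12, 23],
    "AMT_INCOME_TOTAL": [120000.0, 90000.0, 200000.0],
})

# Cyclical encoding: map the hour onto the unit circle so that 23 and 0
# are close, instead of 23 units apart.
angle = 2 * np.pi * apps["HOUR_APPR_PROCESS_START"] / 24
apps["hour_sin"] = np.sin(angle)
apps["hour_cos"] = np.cos(angle)

# Sum of annuities from active previous applications per client (toy
# values), divided by income: can the client afford another loan?
active_annuity = pd.Series({1: 24000.0, 2: 30000.0, 3: 10000.0})
apps["annuity_income_ratio"] = (
    apps["SK_ID_CURR"].map(active_annuity) / apps["AMT_INCOME_TOTAL"]
)
```

With this encoding, hour 23 lands at (sin, cos) ≈ (-0.26, 0.97), a short distance from hour 0 at (0, 1), whereas a raw hour column would treat them as maximally far apart.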
