Viewing a response to: @eneismijmich/fuzzy-evolutionary-and-neurocomputing-neural-networks-part-2
There is an important omission: the ReLU function, which was used in the overwhelming breakthrough that deep learning brought to image recognition. ReLU (rectified linear unit) has the following shape: _____/ (a small sketch follows below). Thank you, and everyone feel free to discuss; I'm implementing a neural network in an Android application right now.
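As a minimal sketch of that shape, here is a plain Java version of ReLU and the derivative used in backpropagation (Java chosen only because the comment mentions an Android app; the class and method names are illustrative, not from the original post):

```java
// Minimal ReLU sketch: zero for negative inputs, identity for positive ones,
// which gives the "_____/" shape described in the comment.
public class Relu {

    // ReLU activation: max(0, x).
    public static double relu(double x) {
        return Math.max(0.0, x);
    }

    // Derivative used during backpropagation: 0 for x < 0, 1 for x > 0.
    // (The value at exactly x == 0 is conventionally taken as 0 or 1.)
    public static double reluDerivative(double x) {
        return x > 0.0 ? 1.0 : 0.0;
    }

    public static void main(String[] args) {
        double[] inputs = {-2.0, -0.5, 0.0, 0.5, 2.0};
        for (double x : inputs) {
            System.out.printf("relu(%+.1f) = %.1f, relu'(%+.1f) = %.1f%n",
                    x, relu(x), x, reluDerivative(x));
        }
    }
}
```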
author | tocode |
---|---|
permlink | re-eneismijmich-fuzzy-evolutionary-and-neurocomputing-neural-networks-part-2-20160909t184344370z |
category | science |
json_metadata | {"tags":["science"]} |
created | 2016-09-09 18:43:45 |
last_update | 2016-09-09 18:43:45 |
depth | 1 |
children | 0 |
last_payout | 2016-10-09 18:59:18 |
cashout_time | 1969-12-31 23:59:59 |
total_payout_value | 0.000 HBD |
curator_payout_value | 0.000 HBD |
pending_payout_value | 0.000 HBD |
promoted | 0.000 HBD |
body_length | 334 |
author_reputation | 1,201,551,387,295 |
root_title | "Fuzzy, Evolutionary and Neurocomputing - Neural Networks (part 2)" |
beneficiaries | [] |
max_accepted_payout | 1,000,000.000 HBD |
percent_hbd | 10,000 |
post_id | 1,187,451 |
net_rshares | 12,339,963,893 |
author_curate_reward | "" |
voter | weight | wgt% | rshares | pct | time |
---|---|---|---|---|---|
tocode | 0 | | 12,339,963,893 | 100% | |