A Worked Example of Logistic Regression Analysis in Stata

2017-06-19 15:47

This post works through a logistic regression analysis in Stata on a pancreatitis dataset. The full session is reproduced below, step by step, so each command and its output can be followed directly.
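
In outline, the session runs the commands below; this is only a road map of the log that follows, with the file path to be replaced by wherever your own copy of the data lives.

* load and inspect the data
use "C:\Stata12\2_data\002-胰腺炎.dta", clear
sum
list

* univariate screens, then the multivariate model
logit mods sex
logit mods age
logit mods ldh
logit mods cr
logit mods abl
logit mods ldh cr abl

* model checking: goodness of fit, classification table, ROC analysis
lfit, g(10)
lstat
predict pre
roctab mods pre
roctab mods pre, g
lsens
roccomp mods pre ldh cr abl
rocgold mods pre ldh cr abl
roctab mods pre, d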

 use "C:\Stata12\2_data\002-胰腺炎.dta", clear

. sum

    Variable |       Obs        Mean    Std. Dev.       Min       Max
-------------+-------------------------------------------------------
          id |       113    785515.7    53014.54     605046    833486
         sex |       113    .5486726    .4998419          0          1
         age |       113    59.06195    18.07933         17         90
         ldh |       113    433.5434    448.6421        2.9      2272
          cr |       113    106.3265     100.756         21        775
-------------+-------------------------------------------------------
         abl |       113    34.45221    6.624105       17.9      51.2
        mods |       113    .2477876    .4336509          0          1
         pre |       113    .2477879    .3315311     .00382     .99995

. list

     +------------------------------------------------------------+
     |     id   sex   age      ldh      cr    abl   mods      pre |
     |------------------------------------------------------------|
  1. | 828966     0    65    299.3    47.1   34.4      1    .0614 |
  2. | 769948     1    40     2036   395.1   25.9      1   .99972 |
  3. | 691896     1    78      881    89.4   39.1      1   .17659 |
  4. | 679641     1    79     2250   360.2   26.2      1   .99972 |
  5. | 766834     1    79      300     775   22.4      1   .99995 |
     |------------------------------------------------------------|
  6. | 746872     1    76      410     177   21.1      1   .86829 |
  7. | 711428     1    58   2047.4     276   27.1      1   .99814 |
  8. | 699401     0    62      633   235.4   24.7      1   .93165 |
  9. | 789971     0    79      225      71   30.2      1    .1432 |
 10. | 788979     1    21     1149      37     21      1   .85097 |
     |------------------------------------------------------------|
 11. | 780270     1    59      881     310     34      1   .92918 |
 12. | 775535     0    77      500     318     28      1   .94542 |
 13. | 650668     1    57     1248     180     29      1   .92791 |
 14. | 697919     1    84      345     210   32.7      1   .51026 |
 15. | 699401     0    62      633     235   24.7      1   .93128 |
     |------------------------------------------------------------|
 16. | 699767     0    76    460.5     157     26      1   .69305 |
 17. | 728235     0    77      359     159   35.4      1   .23909 |
 18. | 734791     0    84      305     138   17.9      1   .84005 |
 19. | 738421     1    56     1487     306     27      1   .99519 |
 20. | 746872     1    76     1211     205   27.2      1   .95914 |
     |------------------------------------------------------------|
 21. | 763940     1    39      407      60   33.4      1   .11039 |
 22. | 822913     0    41     1100      38   28.9      1   .54136 |
 23. | 816293     1    77      506    92.7   40.9      1    .0593 |
 24. | 820032     1    75      320     107   26.4      1   .41857 |
 25. | 821686     1    45      823      63   18.8      1   .84678 |
     |------------------------------------------------------------|
 26. | 831350     0    48   1402.3     318   28.2      1   .99376 |
 27. | 829526     1    65     2272     383   21.6      1   .99992 |
 28. | 830224     0    76    489.7      71   36.2      1   .09599 |
 29. | 685639     0    80      245     123   36.2      0   .10833 |
 30. | 798034     0    40      230      21   24.3      0   .19822 |
     |------------------------------------------------------------|
 31. | 700759     0    46      264      51   30.9      0   .10826 |
 32. | 616791     0    51      293      38   28.8      0   .13795 |
 33. | 805107     1    79      168      52   28.7      0   .12727 |
 34. | 805110     0    46      168      45   33.2      0   .05406 |
 35. | 804010     1    78      224      56   28.2      0   .16314 |
     |------------------------------------------------------------|
 36. | 801367     1    53      175      78     45      0   .01031 |
 37. | 802216     0    76      290      87     32      0    .1504 |
 38. | 803383     0    32      117      66   38.8      0   .02345 |
 39. | 795567     0    44      147      58   39.7      0   .01915 |
 40. | 794845     0    64      203      51   46.9      0    .0053 |
     |------------------------------------------------------------|
 41. | 794119     1    39      189      84   41.6      0   .02164 |
 42. | 794338     0    88      658     205   34.4      0   .60721 |
 43. | 794131     0    60      210      46   41.3      0   .01409 |
 44. | 794202     0    25      555      52   31.8      0   .17736 |
 45. | 803426     0    57      264      58   41.8      0   .01739 |
     |------------------------------------------------------------|
 46. | 806737     1    61      214      79     41      0   .02392 |
 47. | 806539     1    65      181      70   36.5      0   .04376 |
 48. | 806537     1    63      454      80   33.2      0   .16177 |
 49. | 806023     1    56      319      67   38.3      0   .04241 |
 50. | 802369     0    68     1033      88   32.2      0   .52563 |
     |------------------------------------------------------------|
 51. | 802028     0    82      320      64   31.6      0   .12873 |
 52. | 801515     1    35      171      73   37.2      0   .03931 |
 53. | 801928     0    70      449      59   37.7      0   .05758 |
 54. | 800184     0    85      278      55   35.2      0   .05649 |
 55. | 801605     0    70      2.9      54   37.9      0   .01765 |
     |------------------------------------------------------------|
 56. | 801603     0    35      354      30   37.9      0   .02971 |
 57. | 801307     1    86      138      78   34.8      0   .05947 |
 58. | 800230     0    77      225      53   36.1      0   .04133 |
 59. | 794964     1    66      323      95   33.1      0   .14949 |
 60. | 795620     1    43      146      87   36.5      0    .0508 |
     |------------------------------------------------------------|
 61. | 795252     0    48      205      66   33.1      0   .07946 |
 62. | 795526     1    48      174      94     41      0   .02676 |
 63. | 792978     0    58      170      72   35.2      0   .05513 |
 64. | 794217     1    57      270      58   33.9      0   .07237 |
 65. | 773257     0    76      160      63   35.2      0   .04763 |
     |------------------------------------------------------------|
 66. | 792542     1    49      194      57   32.7      0   .07364 |
 67. | 792833     1    47      158      94   34.5      0   .08124 |
 68. | 800538     1    66      217      50   36.6      0   .03558 |
 69. | 789694     1    85      310      76   27.7      0   .26112 |
 70. | 799492     0    72       29      40     29      0   .07581 |
     |------------------------------------------------------------|
 71. | 793578     0    72      186      71     31      0   .11556 |
 72. | 791232     0    77      144      61   34.8      0   .04788 |
 73. | 788760     1    57      145      90   47.6      0   .00703 |
 74. | 799116     1    44      227      61   37.3      0   .03743 |
 75. | 802375     1    49      279      63   45.3      0    .0102 |
     |------------------------------------------------------------|
 76. | 784337     1    32      148      64   35.6      0   .04371 |
 77. | 783947     1    31      269      76   40.8      0   .02719 |
 78. | 783842     1    29      654      74     36      0   .14782 |
 79. | 783501     1    69      236      74     44      0   .01361 |
 80. | 783198     1    84      203      60   37.7      0   .03243 |
     |------------------------------------------------------------|
 81. | 605046     1    35     1194     204   38.1      0   .74518 |
 82. | 610769     0    55      982      50   30.4      0   .44136 |
 83. | 619327     1    17      485      83   41.8      0   .04217 |
 84. | 650544     0    74      258     212     31      0   .54198 |
 85. | 767680     0    70    290.3      80   39.7      0   .03689 |
     |------------------------------------------------------------|
 86. | 829694     1    28      265      73   51.2      0   .00382 |
 87. | 829106     0    59      337      48   35.5      0   .05603 |
 88. | 828745     1    38      218      74   43.8      0    .0135 |
 89. | 828666     1    89      498     101   39.1      0   .08864 |
 90. | 828263     1    50      187      74   28.5      0   .17874 |
     |------------------------------------------------------------|
 91. | 827393     1    77      186      69   42.3      0   .01531 |
 92. | 827369     1    62      242      90   37.9      0   .05191 |
 93. | 827156     0    25      282      54   41.7      0    .0175 |
 94. | 827034     0    27      144      49     30      0   .09364 |
 95. | 826948     0    34      124      48     42      0   .01031 |
     |------------------------------------------------------------|
 96. | 826817     1    34      202      70   37.7      0   .03716 |
 97. | 826696     1    58      303      70     33      0   .10633 |
 98. | 825045     1    63      234      61   30.5      0   .12284 |
 99. | 824940     0    80      271      71   40.9      0   .02502 |
100. | 824605     1    38      157      87   47.7      0   .00681 |
     |------------------------------------------------------------|
101. | 823381     1    70      209      74     31      0   .12624 |
102. | 833486     0    72      168      94   27.6      0   .24636 |
103. | 832515     0    90      193      45   30.8      0   .08678 |
104. | 832070     1    50      219      80   35.9      0   .06098 |
105. | 831928     1    37      131      79   43.5      0   .01236 |
     |------------------------------------------------------------|
106. | 831566     0    62      179      61     41      0   .01704 |
107. | 831124     0    65      235      45   35.6      0   .04146 |
108. | 830946     1    55      115      71   44.9      0   .00819 |
109. | 830745     1    45      134      78   39.7      0   .02456 |
110. | 830581     1    67      369      73   39.2      0   .04423 |
     |------------------------------------------------------------|
111. | 830523     0    63      967      81   34.8      0   .34388 |
112. | 829833     0    75      184      89   39.7      0   .03233 |
113. | 828503     1    29      662      96   26.4      0   .59103 |
     +------------------------------------------------------------+

. logit mods sex

Iteration 0:   log likelihood =  -63.26774 
Iteration 1:   log likelihood = -63.009407 
Iteration 2:   log likelihood = -63.008974 
Iteration 3:   log likelihood = -63.008974 

Logistic regression                               Number of obs   =        113
                                                 LR chi2(1)      =       0.52
                                                 Prob > chi2     =     0.4719
Log likelihood = -63.008974                       Pseudo R2       =     0.0041

------------------------------------------------------------------------------
        mods |      Coef.   Std. Err.      z    P>|z|     [95% Conf. Interval]
-------------+----------------------------------------------------------------
         sex |    .317535   .4437959     0.72   0.474     -.552289   1.187359
       _cons |  -1.290984   .3404542    -3.79   0.000    -1.958262   -.6237061
------------------------------------------------------------------------------
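
The coefficient for sex is reported on the log-odds scale. If an odds ratio is easier to read, the same model can be refitted with the or option, or with the logistic command, which reports odds ratios by default; this is a side note, not part of the original session.

logit mods sex, or      // identical fit, coefficients shown as odds ratios
logistic mods sex       // equivalent command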

. logit mods age

Iteration 0:   log likelihood =  -63.26774 
Iteration 1:   log likelihood = -61.410619 
Iteration 2:   log likelihood = -61.384146 
Iteration 3:   log likelihood = -61.384131 
Iteration 4:   log likelihood = -61.384131 

Logistic regression                               Number of obs   =        113
                                                 LR chi2(1)      =       3.77
                                                 Prob > chi2     =     0.0523
Log likelihood = -61.384131                       Pseudo R2       =     0.0298

------------------------------------------------------------------------------
        mods |      Coef.   Std. Err.      z    P>|z|     [95% Conf. Interval]
-------------+----------------------------------------------------------------
         age |   .0246326   .0131484     1.87   0.061    -.0011379    .050403
       _cons |  -2.614525   .8575939    -3.05   0.002    -4.295378   -.9336716
------------------------------------------------------------------------------

. logit mods ldh

Iteration 0:   log likelihood =  -63.26774 
Iteration 1:   log likelihood = -43.576347 
Iteration 2:   log likelihood = -43.455543 
Iteration 3:   log likelihood = -43.455308 
Iteration 4:   log likelihood = -43.455308 

Logistic regression                               Number of obs   =        113
                                                 LR chi2(1)      =      39.62
                                                 Prob > chi2     =     0.0000
Log likelihood = -43.455308                       Pseudo R2       =     0.3132

------------------------------------------------------------------------------
        mods |      Coef.   Std. Err.      z    P>|z|     [95% Conf. Interval]
-------------+----------------------------------------------------------------
         ldh |   .0040724   .0009141     4.45   0.000     .0022808   .0058641
       _cons |  -3.006031   .4828876    -6.23   0.000    -3.952473   -2.059589
------------------------------------------------------------------------------

. logit mods cr

Iteration 0:   log likelihood =  -63.26774 
Iteration 1:   log likelihood =  -41.24542 
Iteration 2:   log likelihood = -41.119546 
Iteration 3:   log likelihood = -41.117441 
Iteration 4:   log likelihood = -41.117441 

Logistic regression                               Number of obs   =        113
                                                 LR chi2(1)      =      44.30
                                                 Prob > chi2     =     0.0000
Log likelihood = -41.117441                       Pseudo R2       =     0.3501

------------------------------------------------------------------------------
        mods |      Coef.   Std. Err.      z    P>|z|     [95% Conf. Interval]
-------------+----------------------------------------------------------------
          cr |   .0225873   .0050643     4.46   0.000     .0126615   .0325131
       _cons |  -3.578768   .5798729    -6.17   0.000    -4.715298   -2.442238
------------------------------------------------------------------------------

. logit mods abl

Iteration 0:   log likelihood =  -63.26774 
Iteration 1:   log likelihood = -45.365845 
Iteration 2:   log likelihood = -43.453786 
Iteration 3:   log likelihood = -43.421114 
Iteration 4:   log likelihood = -43.421108 
Iteration 5:   log likelihood = -43.421108 

Logistic regression                               Number of obs   =        113
                                                 LR chi2(1)      =      39.69
                                                 Prob > chi2     =     0.0000
Log likelihood = -43.421108                       Pseudo R2       =     0.3137

------------------------------------------------------------------------------
        mods |      Coef.   Std. Err.      z    P>|z|     [95% Conf. Interval]
-------------+----------------------------------------------------------------
         abl |  -.2767854   .0579555    -4.78   0.000    -.3903761  -.1631947
       _cons |   7.821677   1.815949     4.31   0.000     4.262483   11.38087
------------------------------------------------------------------------------

. logit mods ldh cr abl

Iteration 0:   log likelihood =  -63.26774 
Iteration 1:   log likelihood = -31.249401 
Iteration 2:   log likelihood = -30.061031 
Iteration 3:   log likelihood =  -30.03929 
Iteration 4:   log likelihood = -30.039258 
Iteration 5:   log likelihood = -30.039258 

Logistic regression                               Number of obs   =        113
                                                 LR chi2(3)      =      66.46
                                                 Prob > chi2     =     0.0000
Log likelihood = -30.039258                       Pseudo R2       =     0.5252

------------------------------------------------------------------------------
        mods |      Coef.   Std. Err.      z    P>|z|     [95% Conf. Interval]
-------------+----------------------------------------------------------------
         ldh |   .0024992    .001073     2.33   0.020     .0003962   .0046021
          cr |   .0143511   .0057272     2.51   0.012     .0031261   .0255761
         abl |  -.1858638   .0647696    -2.87   0.004    -.3128099  -.0589177
       _cons |    2.24286   2.246818     1.00   0.318    -2.160823   6.646544
------------------------------------------------------------------------------
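
Reading the coefficients off this table, the fitted multivariate model is approximately

    logit(Pr(mods = 1)) = 2.243 + 0.0025*ldh + 0.0144*cr - 0.186*abl

so higher ldh and cr increase the predicted probability of mods = 1, while higher abl decreases it.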

. lfit,g(10)

Logistic model for mods, goodness-of-fit test

  (Table collapsed on quantiles of estimated probabilities)

       number of observations =       113
             number of groups =        10
      Hosmer-Lemeshow chi2(8) =         5.93
                  Prob > chi2 =         0.6549
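
lfit is the older command name; in current Stata releases the same Hosmer-Lemeshow test is documented as a postestimation command, run immediately after the logit or logistic fit (a minimal sketch):

estat gof, group(10)    // Hosmer-Lemeshow goodness-of-fit test with 10 groups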

. lstat

Logistic model for mods

              -------- True --------
Classified |         D            ~D  |      Total
-----------+--------------------------+-----------
     +     |        20             5  |         25
     -     |         8            80  |         88
-----------+--------------------------+-----------
   Total   |        28            85  |        113

Classified + if predicted Pr(D) >= .5
True D defined as mods != 0
--------------------------------------------------
Sensitivity                     Pr( +| D)   71.43%
Specificity                     Pr( -|~D)   94.12%
Positive predictive value       Pr( D| +)   80.00%
Negative predictive value       Pr(~D| -)   90.91%
--------------------------------------------------
False + rate for true ~D        Pr( +|~D)    5.88%
False - rate for true D         Pr( -| D)   28.57%
False + rate for classified +   Pr(~D| +)   20.00%
False - rate for classified -   Pr( D| -)    9.09%
--------------------------------------------------
Correctly classified                        88.50%
--------------------------------------------------
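
lstat has likewise been renamed estat classification in current Stata. The table above uses the default 0.5 cutoff; the cutoff() option changes it (the 0.25 below is purely illustrative):

estat classification                  // same table as lstat, cutoff 0.5
estat classification, cutoff(0.25)    // reclassify with a lower threshold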

. predict pre
(option pr assumed; Pr(mods))
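
Note that the dataset listed above already contains a variable named pre, and predict will not overwrite an existing variable; in a fresh session it is safer to drop or rename it first and to spell out the pr option (a sketch):

capture drop pre      // remove any existing variable of the same name
predict pre, pr       // predicted Pr(mods = 1) from the most recent fit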

. roctab mods pre

                      ROC                    -Asymptotic Normal--
           Obs       Area     Std. Err.      [95% Conf. Interval]
         --------------------------------------------------------
           113     0.9273       0.0268        0.87485     0.97977

. roctab mods pre,g

[Graph: ROC curve produced by roctab mods pre, g]
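
If only the ROC curve of the model currently in memory is needed, lroc draws the corresponding curve and reports its area without first saving predicted probabilities (a side note to the roctab approach used here):

lroc      // ROC curve and area for the most recent logit/logistic fit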


. lsens

[Graph: sensitivity and specificity versus probability cutoff, produced by lsens]
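
lsens plots sensitivity and specificity against the probability cutoff. If the underlying numbers are also wanted, they can be saved to new variables; the names cut, sens and spec below are arbitrary choices:

lsens, genprob(cut) gensens(sens) genspec(spec)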

. roccomp mods  pre ldh cr abl

                              ROC                    -Asymptotic Normal--
                   Obs       Area     Std. Err.      [95% Conf. Interval]
-------------------------------------------------------------------------
pre                113     0.9273       0.0268        0.87485     0.97977
ldh                113     0.9034       0.0285        0.84752     0.95921
cr                 113     0.7998       0.0633        0.67580     0.92378
abl                113     0.1483       0.0444        0.06136     0.23528
-------------------------------------------------------------------------
Ho: area(pre) = area(ldh) = area(cr) = area(abl)
    chi2(3) =   189.39       Prob>chi2 =   0.0000
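
roccomp can also overlay the ROC curves it compares in a single graph; this was not run in the original session but is a one-word addition:

roccomp mods pre ldh cr abl, graph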

. rocgold mods  pre ldh cr abl

-------------------------------------------------------------------------------
                       ROC                                           Bonferroni
                      Area     Std. Err.       chi2    df  Pr>chi2     Pr>chi2
-------------------------------------------------------------------------------
pre (standard)      0.9273       0.0268
ldh                 0.9034       0.0285      0.6873     1   0.4071      1.0000
cr                  0.7998       0.0633      4.9712     1   0.0258      0.0773
abl                 0.1483       0.0444    135.4836     1   0.0000      0.0000
-------------------------------------------------------------------------------
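
The area for abl falls far below 0.5 in both comparisons because abl runs in the opposite direction to the outcome (its logit coefficient is negative). To put all markers on the same footing, its direction can be reversed before comparing; nabl is an illustrative name:

gen nabl = -abl                  // higher values now indicate higher risk
roccomp mods pre ldh cr nabl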

. roctab mods pre,d

Detailed report of sensitivity and specificity
------------------------------------------------------------------------------
                                           Correctly
Cutpoint      Sensitivity   Specificity   Classified          LR+          LR-
------------------------------------------------------------------------------
( >= .00382 )     100.00%         0.00%       24.78%       1.0000    
( >= .0053 )      100.00%         1.18%       25.66%       1.0119       0.0000
( >= .00681 )     100.00%         2.35%       26.55%       1.0241       0.0000
( >= .00703 )     100.00%         3.53%       27.43%       1.0366       0.0000
( >= .00819 )     100.00%         4.71%       28.32%       1.0494       0.0000
( >= .0102 )      100.00%         5.88%       29.20%       1.0625       0.0000
( >= .01031 )     100.00%         7.06%       30.09%       1.0759       0.0000
( >= .01236 )     100.00%         9.41%       31.86%       1.1039       0.0000
( >= .0135 )      100.00%        10.59%       32.74%       1.1184       0.0000
( >= .01361 )     100.00%        11.76%       33.63%       1.1333       0.0000
( >= .01409 )     100.00%        12.94%       34.51%       1.1486       0.0000
( >= .01531 )     100.00%        14.12%       35.40%       1.1644       0.0000
( >= .01704 )     100.00%        15.29%       36.28%       1.1806       0.0000
( >= .01739 )     100.00%        16.47%       37.17%       1.1972       0.0000
( >= .0175 )      100.00%        17.65%       38.05%       1.2143       0.0000
( >= .01765 )     100.00%        18.82%       38.94%       1.2319       0.0000
( >= .01915 )     100.00%        20.00%       39.82%       1.2500       0.0000
( >= .02164 )     100.00%        21.18%       40.71%       1.2687       0.0000
( >= .02345 )     100.00%        22.35%       41.59%       1.2879       0.0000
( >= .02392 )     100.00%        23.53%       42.48%       1.3077       0.0000
( >= .02456 )     100.00%        24.71%       43.36%       1.3281       0.0000
( >= .02502 )     100.00%        25.88%       44.25%       1.3492       0.0000
( >= .02676 )     100.00%        27.06%       45.13%       1.3710       0.0000
( >= .02719 )     100.00%        28.24%       46.02%       1.3934       0.0000
( >= .02971 )     100.00%        29.41%       46.90%       1.4167       0.0000
( >= .03233 )     100.00%        30.59%       47.79%       1.4407       0.0000
( >= .03243 )     100.00%        31.76%       48.67%       1.4655       0.0000
( >= .03558 )     100.00%        32.94%       49.56%       1.4912       0.0000
( >= .03689 )     100.00%        34.12%       50.44%       1.5179       0.0000
( >= .03716 )     100.00%        35.29%       51.33%       1.5455       0.0000
( >= .03743 )     100.00%        36.47%       52.21%       1.5741       0.0000
( >= .03931 )     100.00%        37.65%       53.10%       1.6038       0.0000
( >= .04133 )     100.00%        38.82%       53.98%       1.6346       0.0000
( >= .04146 )     100.00%        40.00%       54.87%       1.6667       0.0000
( >= .04217 )     100.00%        41.18%       55.75%       1.7000       0.0000
( >= .04241 )     100.00%        42.35%       56.64%       1.7347       0.0000
( >= .04371 )     100.00%        43.53%       57.52%       1.7708       0.0000
( >= .04376 )     100.00%        44.71%       58.41%       1.8085       0.0000
( >= .04423 )     100.00%        45.88%       59.29%       1.8478       0.0000
( >= .04763 )     100.00%        47.06%       60.18%       1.8889       0.0000
( >= .04788 )     100.00%        48.24%       61.06%       1.9318       0.0000
( >= .0508 )      100.00%        49.41%       61.95%       1.9767       0.0000
( >= .05191 )     100.00%        50.59%       62.83%       2.0238       0.0000
( >= .05406 )     100.00%        51.76%       63.72%       2.0732       0.0000
( >= .05513 )     100.00%        52.94%       64.60%       2.1250       0.0000
( >= .05603 )     100.00%        54.12%       65.49%       2.1795       0.0000
( >= .05649 )     100.00%        55.29%       66.37%       2.2368       0.0000
( >= .05758 )     100.00%        56.47%       67.26%       2.2973       0.0000
( >= .0593 )      100.00%        57.65%       68.14%       2.3611       0.0000
( >= .05947 )      96.43%        57.65%       67.26%       2.2768       0.0620
( >= .06098 )      96.43%        58.82%       68.14%       2.3418       0.0607
( >= .0614 )       96.43%        60.00%       69.03%       2.4107       0.0595
( >= .07237 )      92.86%        60.00%       68.14%       2.3214       0.1190
( >= .07364 )      92.86%        61.18%       69.03%       2.3918       0.1168
( >= .07581 )      92.86%        62.35%       69.91%       2.4665       0.1146
( >= .07946 )      92.86%        63.53%       70.80%       2.5461       0.1124
( >= .08124 )      92.86%        64.71%       71.68%       2.6310       0.1104
( >= .08678 )      92.86%        65.88%       72.57%       2.7217       0.1084
( >= .08864 )      92.86%        67.06%       73.45%       2.8189       0.1065
( >= .09364 )      92.86%        68.24%       74.34%       2.9233       0.1047
( >= .09599 )      92.86%        69.41%       75.22%       3.0357       0.1029
( >= .10633 )      89.29%        69.41%       74.34%       2.9190       0.1544
( >= .10826 )      89.29%        70.59%       75.22%       3.0357       0.1518
( >= .10833 )      89.29%        71.76%       76.11%       3.1622       0.1493
( >= .11039 )      89.29%        72.94%       76.99%       3.2997       0.1469
( >= .11556 )      85.71%        72.94%       76.11%       3.1677       0.1959
( >= .12284 )      85.71%        74.12%       76.99%       3.3117       0.1927
( >= .12624 )      85.71%        75.29%       77.88%       3.4694       0.1897
( >= .12727 )      85.71%        76.47%       78.76%       3.6429       0.1868
( >= .12873 )      85.71%        77.65%       79.65%       3.8346       0.1840
( >= .13795 )      85.71%        78.82%       80.53%       4.0476       0.1812
( >= .1432 )       85.71%        80.00%       81.42%       4.2857       0.1786
( >= .14782 )      82.14%        80.00%       80.53%       4.1071       0.2232
( >= .14949 )      82.14%        81.18%       81.42%       4.3638       0.2200
( >= .1504 )       82.14%        82.35%       82.30%       4.6548       0.2168
( >= .16177 )      82.14%        83.53%       83.19%       4.9872       0.2138
( >= .16314 )      82.14%        84.71%       84.07%       5.3709       0.2108
( >= .17659 )      82.14%        85.88%       84.96%       5.8185       0.2079
( >= .17736 )      78.57%        85.88%       84.07%       5.5655       0.2495
( >= .17874 )      78.57%        87.06%       84.96%       6.0714       0.2461
( >= .19822 )      78.57%        88.24%       85.84%       6.6786       0.2429
( >= .23909 )      78.57%        89.41%       86.73%       7.4206       0.2397
( >= .24636 )      75.00%        89.41%       85.84%       7.0833       0.2796
( >= .26112 )      75.00%        90.59%       86.73%       7.9687       0.2760
( >= .34388 )      75.00%        91.76%       87.61%       9.1071       0.2724
( >= .41857 )      75.00%        92.94%       88.50%      10.6250       0.2690
( >= .44136 )      71.43%        92.94%       87.61%      10.1190       0.3074
( >= .51026 )      71.43%        94.12%       88.50%      12.1429       0.3036
( >= .52563 )      67.86%        94.12%       87.61%      11.5357       0.3415
( >= .54136 )      67.86%        95.29%       88.50%      14.4197       0.3373
( >= .54198 )      64.29%        95.29%       87.61%      13.6607       0.3748
( >= .59103 )      64.29%        96.47%       88.50%      18.2143       0.3702
( >= .60721 )      64.29%        97.65%       89.38%      27.3214       0.3657
( >= .69305 )      64.29%        98.82%       90.27%      54.6430       0.3614
( >= .74518 )      60.71%        98.82%       89.38%      51.6073       0.3975
( >= .84005 )      60.71%       100.00%       90.27%                    0.3929
( >= .84678 )      57.14%       100.00%       89.38%                    0.4286
( >= .85097 )      53.57%       100.00%       88.50%                    0.4643
( >= .86829 )      50.00%       100.00%       87.61%                    0.5000
( >= .92791 )      46.43%       100.00%       86.73%                    0.5357
( >= .92918 )      42.86%       100.00%       85.84%                    0.5714
( >= .93128 )      39.29%       100.00%       84.96%                    0.6071
( >= .93165 )      35.71%       100.00%       84.07%                    0.6429
( >= .94542 )      32.14%       100.00%       83.19%                    0.6786
( >= .95914 )      28.57%       100.00%       82.30%                    0.7143
( >= .99376 )      25.00%       100.00%       81.42%                    0.7500
( >= .99519 )      21.43%       100.00%       80.53%                    0.7857
( >= .99814 )      17.86%       100.00%       79.65%                    0.8214
( >= .99972 )      14.29%       100.00%       78.76%                    0.8571
( >= .99992 )       7.14%       100.00%       76.99%                    0.9286
( >= .99995 )       3.57%       100.00%       76.11%                    0.9643
( >  .99995 )       0.00%       100.00%       75.22%                    1.0000
------------------------------------------------------------------------------
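
One common way to choose a working cutpoint from this table is the Youden index, sensitivity + specificity - 1, which is largest where the two columns are jointly highest; whether that cutpoint is clinically sensible still depends on the relative costs of false negatives and false positives.

* Youden index at two high-performing cutpoints from the table above
display .8214 + .8588 - 1     // cutpoint >= .17659  ->  .6802
display .7857 + .8941 - 1     // cutpoint >= .23909  ->  .6798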

