[Evolvis-commits] r410: Solved conflicts arising from 2.6 import/merge. [CB]
mirabilos at evolvis.org
Thu Feb 25 14:21:52 CET 2010
Author: mirabilos
Date: 2010-02-25 13:21:51 +0000 (Thu, 25 Feb 2010)
New Revision: 410
Added:
trunk/gforge_base/evolvisforge/gforge/utils/include_2_5.pl
trunk/gforge_base/evolvisforge/gforge/utils/sql2ldif_2_5.pl
trunk/gforge_base/evolvisforge/gforge/utils/underworld-dummy/ssh_dump_2_5.pl
Removed:
trunk/gforge_base/evolvisforge/gforge/utils/cvs1/cvs_history_parse.pl
trunk/gforge_base/evolvisforge/gforge/utils/cvs1/run_span.pl
trunk/gforge_base/evolvisforge/gforge/utils/download/db_dlstats_allagg.pl
trunk/gforge_base/evolvisforge/gforge/utils/download/db_dlstats_filetotal.pl
trunk/gforge_base/evolvisforge/gforge/utils/download/db_dlstats_grouptotal.pl
trunk/gforge_base/evolvisforge/gforge/utils/download/db_dlstats_grp.pl
trunk/gforge_base/evolvisforge/gforge/utils/download/remission_filemaint.pl
trunk/gforge_base/evolvisforge/gforge/utils/download/stats_agr_filerelease.pl
trunk/gforge_base/evolvisforge/gforge/utils/download/stats_ftp_logparse.pl
trunk/gforge_base/evolvisforge/gforge/utils/download/stats_http_logparse.pl
trunk/gforge_base/evolvisforge/gforge/utils/download/stats_logparse.sh
trunk/gforge_base/evolvisforge/gforge/utils/download/stats_nightly_filerelease.pl
trunk/gforge_base/evolvisforge/gforge/utils/projects-fileserver/
trunk/gforge_base/evolvisforge/gforge/utils/underworld-root/db_stats_cvs_history.pl
trunk/gforge_base/evolvisforge/gforge/utils/underworld-root/db_stats_prepare.pl
trunk/gforge_base/evolvisforge/gforge/utils/underworld-root/db_stats_projects_nightly.pl
trunk/gforge_base/evolvisforge/gforge/utils/underworld-root/db_stats_site_nightly.pl
trunk/gforge_base/evolvisforge/gforge/utils/underworld-root/run_span.pl
trunk/gforge_base/evolvisforge/gforge/utils/underworld-root/stats_nightly.sh
Modified:
trunk/gforge_base/evolvisforge/
trunk/gforge_base/evolvisforge/gforge/utils/include.pl
trunk/gforge_base/evolvisforge/gforge/utils/sql2ldif.pl
trunk/gforge_base/evolvisforge/gforge/utils/underworld-dummy/ssh_dump.pl
Log:
Solved conflicts arising from 2.6 import/merge. [CB]
Property changes on: trunk/gforge_base/evolvisforge
___________________________________________________________________
Name: bzr:revision-info
- timestamp: 2001-11-02 11:40:24.000000000 +0000
committer: lo-lan-do
+ timestamp: 2001-11-02 11:43:06.000000000 +0000
committer: cbayle
Name: bzr:file-ids
-
+ gforge/utils/include.pl	2@9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk%2Fgforge%2Futils%2Finclude.pl
gforge/utils/include_2_5.pl	189@9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk%2Fgforge%2Futils%2Finclude_2_5.pl
gforge/utils/sql2ldif.pl	2@9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk%2Fgforge%2Futils%2Fsql2ldif.pl
gforge/utils/sql2ldif_2_5.pl	189@9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk%2Fgforge%2Futils%2Fsql2ldif_2_5.pl
gforge/utils/underworld-dummy/ssh_dump.pl	2@9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk%2Fgforge%2Futils%2Funderworld-dummy%2Fssh_dump.pl
gforge/utils/underworld-dummy/ssh_dump_2_5.pl	189@9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk%2Fgforge%2Futils%2Funderworld-dummy%2Fssh_dump_2_5.pl
Name: bzr:revision-id:v4
- 1 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:1
2 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:2
3 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:7
4 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:9
5 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:10
6 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:11
7 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:12
8 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:13
9 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:14
10 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:15
11 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:16
12 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:17
13 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:18
14 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:19
15 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:20
16 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:21
17 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:22
18 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:23
19 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:24
20 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:25
21 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:26
22 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:27
23 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:28
24 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:29
25 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:30
26 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:31
27 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:32
28 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:33
29 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:34
30 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:35
31 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:36
32 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:37
33 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:38
34 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:39
35 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:40
36 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:41
37 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:42
38 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:43
39 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:44
40 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:45
41 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:46
42 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:47
43 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:48
44 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:49
45 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:50
46 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:51
47 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:52
48 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:53
49 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:54
50 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:55
51 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:56
52 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:57
53 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:58
54 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:59
55 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:60
56 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:61
57 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:62
58 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:63
59 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:64
60 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:65
61 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:66
62 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:67
63 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:68
64 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:69
65 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:70
66 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:71
67 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:72
68 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:73
69 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:74
70 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:75
71 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:76
72 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:77
73 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:78
74 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:79
75 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:80
76 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:81
77 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:82
78 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:83
79 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:84
80 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:85
81 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:86
82 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:87
83 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:88
84 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:89
85 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:90
86 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:91
87 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:92
88 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:93
89 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:94
90 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:95
91 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:96
92 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:97
93 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:98
94 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:99
95 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:100
96 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:101
97 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:102
98 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:103
99 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:104
100 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:105
101 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:106
102 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:107
103 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:108
104 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:109
105 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:110
106 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:111
107 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:112
108 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:113
109 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:114
110 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:115
111 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:116
112 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:117
113 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:118
114 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:119
115 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:120
116 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:121
117 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:122
118 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:123
119 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:124
120 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:125
121 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:126
122 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:127
123 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:128
124 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:129
125 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:130
126 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:131
127 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:132
128 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:133
129 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:134
130 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:135
131 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:136
132 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:137
133 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:138
134 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:139
135 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:140
136 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:141
137 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:142
138 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:143
139 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:144
140 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:145
141 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:146
142 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:147
143 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:148
144 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:149
145 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:150
146 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:151
147 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:152
148 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:153
149 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:154
150 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:155
151 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:156
152 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:157
153 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:158
154 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:159
155 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:160
156 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:161
157 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:162
158 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:163
159 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:164
160 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:165
161 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:166
162 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:167
163 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:168
164 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:169
165 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:170
166 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:172
167 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:173
168 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:174
169 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:175
170 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:176
171 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:180
172 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:184
173 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:186
174 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:187
175 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:188
+ 1 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:1
2 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:2
3 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:7
4 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:9
5 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:10
6 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:11
7 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:12
8 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:13
9 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:14
10 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:15
11 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:16
12 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:17
13 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:18
14 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:19
15 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:20
16 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:21
17 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:22
18 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:23
19 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:24
20 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:25
21 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:26
22 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:27
23 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:28
24 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:29
25 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:30
26 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:31
27 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:32
28 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:33
29 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:34
30 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:35
31 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:36
32 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:37
33 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:38
34 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:39
35 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:40
36 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:41
37 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:42
38 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:43
39 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:44
40 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:45
41 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:46
42 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:47
43 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:48
44 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:49
45 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:50
46 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:51
47 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:52
48 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:53
49 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:54
50 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:55
51 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:56
52 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:57
53 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:58
54 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:59
55 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:60
56 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:61
57 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:62
58 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:63
59 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:64
60 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:65
61 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:66
62 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:67
63 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:68
64 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:69
65 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:70
66 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:71
67 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:72
68 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:73
69 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:74
70 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:75
71 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:76
72 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:77
73 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:78
74 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:79
75 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:80
76 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:81
77 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:82
78 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:83
79 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:84
80 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:85
81 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:86
82 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:87
83 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:88
84 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:89
85 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:90
86 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:91
87 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:92
88 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:93
89 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:94
90 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:95
91 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:96
92 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:97
93 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:98
94 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:99
95 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:100
96 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:101
97 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:102
98 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:103
99 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:104
100 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:105
101 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:106
102 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:107
103 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:108
104 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:109
105 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:110
106 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:111
107 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:112
108 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:113
109 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:114
110 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:115
111 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:116
112 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:117
113 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:118
114 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:119
115 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:120
116 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:121
117 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:122
118 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:123
119 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:124
120 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:125
121 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:126
122 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:127
123 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:128
124 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:129
125 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:130
126 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:131
127 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:132
128 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:133
129 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:134
130 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:135
131 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:136
132 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:137
133 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:138
134 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:139
135 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:140
136 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:141
137 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:142
138 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:143
139 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:144
140 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:145
141 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:146
142 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:147
143 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:148
144 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:149
145 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:150
146 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:151
147 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:152
148 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:153
149 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:154
150 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:155
151 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:156
152 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:157
153 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:158
154 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:159
155 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:160
156 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:161
157 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:162
158 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:163
159 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:164
160 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:165
161 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:166
162 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:167
163 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:168
164 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:169
165 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:170
166 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:172
167 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:173
168 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:174
169 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:175
170 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:176
171 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:180
172 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:184
173 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:186
174 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:187
175 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:188
176 svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:189
Name: bzr:text-parents
-
+ gforge/utils/include.pl svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:102
gforge/utils/sql2ldif.pl svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:74
gforge/utils/underworld-dummy/ssh_dump.pl svn-v4:9d84d37e-dcb1-4aad-b103-6f3d92f53bf6:trunk:7
Deleted: trunk/gforge_base/evolvisforge/gforge/utils/cvs1/cvs_history_parse.pl
===================================================================
--- trunk/gforge_base/evolvisforge/gforge/utils/cvs1/cvs_history_parse.pl 2010-02-25 13:21:48 UTC (rev 409)
+++ trunk/gforge_base/evolvisforge/gforge/utils/cvs1/cvs_history_parse.pl 2010-02-25 13:21:51 UTC (rev 410)
@@ -1,162 +0,0 @@
-#!/usr/bin/perl
-##
-## cvs_history_parse.pl
-##
-## NIGHTLY SCRIPT
-##
-## Recurses through the /cvsroot directory tree and parses each projects
-## '~/CVSROOT/history' file, building agregate stats on the number of
-## checkouts, commits, and adds to each project over the past 24 hours.
-##
-##
-## $Id: cvs_history_parse.pl,v 1.7 2000/11/03 02:17:31 tperdue Exp $
-##
-use strict;
-use Time::Local;
-use POSIX qw( strftime );
-
-my ($year, $month, $day, $day_begin, $day_end);
-my ($group, $histline, $daily_log_file, $key, $verbose);
-my $verbose = 0;
-my $base_log_dir = "/usr/local/boa/htdocs/cvslogs";
-
-$|=0 if $verbose;
-
- ## Set the time to collect stats for
-if ( $ARGV[0] && $ARGV[1] && $ARGV[2] ) {
-
- $day_begin = timegm( 0, 0, 0, $ARGV[2], $ARGV[1] - 1, $ARGV[0] - 1900 );
- $day_end = timegm( 0, 0, 0, (gmtime( $day_begin + 86400 ))[3,4,5] );
-
- $year = $ARGV[0];
- $month = $ARGV[1];
- $day = $ARGV[2];
-
-} else {
-
- ## Start at midnight last night.
- $day_end = timegm( 0, 0, 0, (gmtime( time() ))[3,4,5] );
- ## go until midnight yesterday.
- $day_begin = timegm( 0, 0, 0, (gmtime( time() - 86400 ))[3,4,5] );
-
- $year = strftime("%Y", gmtime( $day_begin ) );
- $month = strftime("%m", gmtime( $day_begin ) );
- $day = strftime("%d", gmtime( $day_begin ) );
-
-}
-
-<<<<<<< cvs_history_parse.pl
-<<<<<<< cvs_history_parse.pl
-my $daily_log_file;
-
-=======
-print "Parsing cvs logs looking for traffic on day $day, month $month, year $year.\n" if $verbose;
-
->>>>>>> 1.5
-if ( -d $base_log_dir ) {
- $daily_log_file = $base_log_dir . "/" . sprintf("%04d",$year);
- if ( ! -d $daily_log_file ) {
- mkdir( $daily_log_file, 0755 );
- }
- $daily_log_file .= "/" . sprintf("%02d", $month);
- if ( ! -d $daily_log_file ) {
- mkdir( $daily_log_file, 0755 );
- }
- $daily_log_file . "/cvs_traffic_" . sprintf("%04d%02d$02d",$year,$month,$day) . ".log";
-}
-=======
-if ( -d $base_log_dir ) {
- $daily_log_file = $base_log_dir . "/" . sprintf("%04d", $year);
- if ( ! -d $daily_log_file ) {
- print "Making dest dir \'$daily_log_file\'\n";
- mkdir( $daily_log_file, 0755 ) || die("Could not mkdir $daily_log_file");
- }
- $daily_log_file .= "/" . sprintf("%02d", $month);
- if ( ! -d $daily_log_file ) {
- print "Making dest dir \'$daily_log_file\'\n";
- mkdir( $daily_log_file, 0755 ) || die("Could not mkdir $daily_log_file");
- }
- $daily_log_file .= "/cvs_traffic_" . sprintf("%04d%02d%02d",$year,$month,$day) . ".log";
-} else {
- die("Base log directory \'$base_log_dir\' does not exist!");
-}
-
->>>>>>> 1.4
-open(DAYS_LOG, "> $daily_log_file") || die "Unable to open the log file \'$daily_log_file\'";
-print "Opened log file at \'$daily_log_file\' for writing...\n";
-print "Running tree at /cvsroot/\n";
-
-chdir( "/cvsroot" ) || die("Unable to make /cvsroot the working directory.\n");
-foreach $group ( glob("*") ) {
-
- next if ( ! -d "$group" );
-
- my ($cvs_co, $cvs_commit, $cvs_add, %usr_commit, %usr_add );
-
- open(HISTORY, "< /cvsroot/$group/CVSROOT/history") or print "E::Unable to open history for $group\n";
- while ( <HISTORY> ) {
- my ($time_parsed, $type, $cvstime, $user, $curdir, $module, $rev, $file );
-
- ## Split the cvs history entry into it's 6 fields.
- ($cvstime,$user,$curdir,$module,$rev,$file) = split(/\|/, $_, 6 );
-
- $type = substr($cvstime, 0, 1);
- $time_parsed = hex( substr($cvstime, 1, 8) );
-
- ## If the entry was made in the past 24 hours
- ## (i.e. - since the last run of this script...)
- if ( ($time_parsed > $day_begin) && ($time_parsed < $day_end) ) {
-
- ## log commits
- if ( $type eq "M" ) {
- $cvs_commit++;
- $usr_commit{$user}++;
- next;
- }
-
- ## log adds
- if ( $type eq "A" ) {
- $cvs_add++;
- $usr_add{$user}++;
- next;
- }
-
- ## log checkouts
- if ( $type eq "O" ) {
- $cvs_co++;
- ## we don't care about checkouts on a per-user
- ## most of them will be anon anyhow.
- next;
- }
-
- } elsif ( $time_parsed > $day_end ) {
- if ( $verbose >= 2 ) {
- print "Short circuting execution, parsed date exceeded current threshold.\n";
- }
- last;
- }
-
- }
- close( HISTORY );
-
- ## Now, we'll print all of the results for that project, in the following format:
- ## (G|U|E)::proj_name::user_name::checkouts::commits::adds
- ## If 'G', then record is group statistics, and field 2 is a space...
- ## If 'U', then record is per-user stats, and field 2 is the user name...
- ## If 'E', then record is an error, and field 1 is a description, there are no other fields.
- if ( $cvs_co || $cvs_commit || $cvs_add ) {
- print DAYS_LOG "G::" . $group . ":: ::" . ($cvs_co?$cvs_co:"0") . "::"
- . ($cvs_commit?$cvs_commit:"0") . "::" . ($cvs_add?$cvs_add:"0") . "\n";
-
- foreach $key ( keys %usr_commit ) {
-
- print DAYS_LOG "U::" . $group . "::" . $key . "::0::" . ($usr_commit{$key}?$usr_commit{$key}:"0")
- . "::" . ($usr_add{$key}?$usr_add{$key}:"0") . "\n";
- }
- }
-}
-print "Done processing cvs history file for this date.\n" if $verbose;
-
-##
-## EOF
-##
Deleted: trunk/gforge_base/evolvisforge/gforge/utils/cvs1/run_span.pl
===================================================================
--- trunk/gforge_base/evolvisforge/gforge/utils/cvs1/run_span.pl 2010-02-25 13:21:48 UTC (rev 409)
+++ trunk/gforge_base/evolvisforge/gforge/utils/cvs1/run_span.pl 2010-02-25 13:21:51 UTC (rev 410)
@@ -1,23 +0,0 @@
-#!/usr/bin/perl
-use Time::Local;
-
-$script = "cvs_history_parse.pl";
-$span = $ARGV[0];
-$year = $ARGV[1];
-$month = $ARGV[2];
-$day = $ARGV[3];
-
-$| = 0;
-print "Processing $span day span from $month/$day/$year ...\n";
-
-for ( $i = 1; $i <= $span; $i++ ) {
-
- $command = "perl $script $year $month $day";
- print STDERR "Running \'$command\' from the current directory...\n";
- print STDERR `$command`;
-
- ($year,$month,$day) = (gmtime( timegm(0,0,0,$day + 1,$month - 1,$year - 1900) ))[5,4,3];
- $year += 1900;
- $month += 1;
-}
-
Deleted: trunk/gforge_base/evolvisforge/gforge/utils/download/db_dlstats_allagg.pl
===================================================================
--- trunk/gforge_base/evolvisforge/gforge/utils/download/db_dlstats_allagg.pl 2010-02-25 13:21:48 UTC (rev 409)
+++ trunk/gforge_base/evolvisforge/gforge/utils/download/db_dlstats_allagg.pl 2010-02-25 13:21:51 UTC (rev 410)
@@ -1,99 +0,0 @@
-#!/usr/bin/perl
-#
-# $Id: db_dlstats_allagg.pl,v 1.5 2000/08/10 16:54:32 tperdue Exp $
-#
-use DBI;
-
-require("../include.pl"); # Include all the predefined functions
-
-&db_connect;
-
-#once per year
-opendir DIRYEAR, "/home/log";
-@diryear = grep /\d\d\d\d/, readdir DIRYEAR;
-foreach $logyear (@diryear) {
- #now per month
- opendir DIRMONTH, "/home/log/$logyear";
-	@dirmonth = grep /\d\d/, readdir DIRMONTH;
- foreach $logmonth (@dirmonth) {
- #now per combined_log
- opendir DIRENTRY, "/home/log/$logyear/$logmonth";
-		@combinedlogs = grep /combined.*\.log$/, readdir DIRENTRY;
- foreach $combinedlog (@combinedlogs) {
- print "Processing $combinedlog...\n";
- if ($combinedlog =~ /\d\d\d\d\d\d(\d\d)/) {
- $logday = $1;
- } else {
- die ("Cannot find day in logfilename.");
- }
-
- undef %ct_file;
- undef %ct_group;
-
- $LOGFILE = "/home/log/$logyear/$logmonth/$combinedlog";
-
- open LOGFILE or die "Cannot open $LOGFILE";
- while (<LOGFILE>) {
- use integer;
- $logline = $_;
- if ($logline =~ /((\d|\.)+).*\[(\d+)\/(\w+)\/(\d+):(\d+):(\d+):(\d+)\s.*GET \/((?:\w|-)+)\/(.+)\sHTTP.+\s200\s(\d+)/ && !($9 eq 'mirrors') && !($9 eq 'pub') && !($9 eq 'debian')) {
- $grp = $9;
- $file = $10;
- $bytes = $11;
- $ip = $1;
-
- # Get time diff
- $mday = $3;
- if ($4 eq 'Jan') { $mon = "01"; }
- elsif ($4 eq 'Feb') { $mon = "02"; }
- elsif ($4 eq 'Mar') { $mon = "03"; }
- elsif ($4 eq 'Apr') { $mon = "04"; }
- elsif ($4 eq 'May') { $mon = "05"; }
- elsif ($4 eq 'Jun') { $mon = "06"; }
- elsif ($4 eq 'Jul') { $mon = "07"; }
- elsif ($4 eq 'Aug') { $mon = "08"; }
- elsif ($4 eq 'Sep') { $mon = "09"; }
- elsif ($4 eq 'Oct') { $mon = "10"; }
- elsif ($4 eq 'Nov') { $mon = "11"; }
- elsif ($4 eq 'Dec') { $mon = "12"; }
- $year = $5;
- $hour = $6;
- $min = $7;
- $sec = $8;
-
- $grp =~ s/%([0-9a-fA-F][0-9a-fA-F])/pack("C", hex($1))/eg;
- $file =~ s/%([0-9a-fA-F][0-9a-fA-F])/pack("C", hex($1))/eg;
-
- $ct_group{$grp}++;
- $ct_file{$grp}{$file}++;
- } #end per line of good regexed logfile
- } # end while processing logfile
-
- #delete all rows for this day
- my $query = "DELETE FROM frs_dlstats_agg WHERE day="
- .$logyear.$logmonth.$logday;
- my $rel = $dbh->prepare($query);
- $rel->execute();
-
- #now output the database rows
- while (($keygrp,$valgrp) = each (%ct_group)) {
- while (($keyfile,$valfile) = each (%{$ct_file{$keygrp}})) {
- #get fileid
- my $query = "SELECT filerelease.filerelease_id FROM filerelease,groups WHERE filerelease.group_id=groups.group_id AND groups.unix_group_name='$keygrp' AND filerelease.filename='$keyfile'";
- my $rel = $dbh->prepare($query);
- $rel->execute();
- ($filerelease_id) = $rel->fetchrow();
-
- my $query = "INSERT INTO frs_dlstats_agg "
- ."(file_id,day,downloads_http) VALUES "
- ."(".$filerelease_id.",".$logyear.$logmonth.$logday.","
- .$valfile.")";
- if ($filerelease_id > 0) {
- my $rel = $dbh->prepare($query);
- $rel->execute();
- }
- }
- }
- }
- }
-}
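The twelve-branch elsif chain in the deleted script above maps month abbreviations ('Jan'..'Dec') to zero-padded month numbers; a table lookup is the usual compression of that pattern. A minimal sketch of the same idea (Python used only for illustration here, the script itself is Perl):

```python
# Lookup table replacing the elsif chain: 'Jan' -> "01", ..., 'Dec' -> "12".
MONTHS = {m: "%02d" % i for i, m in enumerate(
    ["Jan", "Feb", "Mar", "Apr", "May", "Jun",
     "Jul", "Aug", "Sep", "Oct", "Nov", "Dec"], start=1)}

def month_number(abbr):
    """Return the zero-padded month number for an abbreviation like 'Feb'."""
    return MONTHS[abbr]
```

In Perl the equivalent would be a `%months` hash initialized once outside the log-parsing loop.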
Deleted: trunk/gforge_base/evolvisforge/gforge/utils/download/db_dlstats_filetotal.pl
===================================================================
--- trunk/gforge_base/evolvisforge/gforge/utils/download/db_dlstats_filetotal.pl 2010-02-25 13:21:48 UTC (rev 409)
+++ trunk/gforge_base/evolvisforge/gforge/utils/download/db_dlstats_filetotal.pl 2010-02-25 13:21:51 UTC (rev 410)
@@ -1,27 +0,0 @@
-#!/usr/bin/perl
-#
-# $Id: db_dlstats_filetotal.pl,v 1.5 2000/08/10 16:54:32 tperdue Exp $
-#
-use DBI;
-
-require("../include.pl"); # Include all the predefined functions
-
-&db_connect;
-
-# doing this for all days for now
-my $query = "SELECT file_id, SUM(downloads_http + downloads_ftp) AS downloads "
- ."FROM frs_dlstats_agg GROUP BY file_id";
-my $rel = $dbh->prepare($query);
-$rel->execute();
-
-my $query = "DELETE FROM frs_dlstats_filetotal_agg";
-my $reldel = $dbh->prepare($query);
-$reldel->execute();
-
-# for each day
-while(my ($file_id,$downloads) = $rel->fetchrow()) {
- my $query = "INSERT INTO frs_dlstats_filetotal_agg (file_id,downloads) "
- ."VALUES (".$file_id.",".$downloads.")";
- my $reldb = $dbh->prepare($query);
- $reldb->execute();
-}
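The deleted script above recomputes per-file totals with `SUM(downloads_http + downloads_ftp) ... GROUP BY file_id` and re-inserts one row per file. The same aggregation, sketched in Python for illustration (hypothetical row shape assumed):

```python
from collections import defaultdict

def file_totals(rows):
    """Aggregate daily (file_id, downloads_http, downloads_ftp) rows
    into per-file totals, like SUM(...) GROUP BY file_id."""
    totals = defaultdict(int)
    for file_id, http, ftp in rows:
        totals[file_id] += http + ftp
    return dict(totals)
```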
Deleted: trunk/gforge_base/evolvisforge/gforge/utils/download/db_dlstats_grouptotal.pl
===================================================================
--- trunk/gforge_base/evolvisforge/gforge/utils/download/db_dlstats_grouptotal.pl 2010-02-25 13:21:48 UTC (rev 409)
+++ trunk/gforge_base/evolvisforge/gforge/utils/download/db_dlstats_grouptotal.pl 2010-02-25 13:21:51 UTC (rev 410)
@@ -1,27 +0,0 @@
-#!/usr/bin/perl
-#
-# $Id: db_dlstats_grouptotal.pl,v 1.5 2000/08/10 16:54:32 tperdue Exp $
-#
-use DBI;
-
-require("../include.pl"); # Include all the predefined functions
-
-&db_connect;
-
-# doing this for all days for now
-my $query = "SELECT group_id, SUM(downloads) AS downloads "
- ."FROM frs_dlstats_group_agg GROUP BY group_id";
-my $rel = $dbh->prepare($query);
-$rel->execute();
-
-my $query = "DELETE FROM frs_dlstats_grouptotal_agg";
-my $reldel = $dbh->prepare($query);
-$reldel->execute();
-
-# for each day
-while(my ($group_id,$downloads) = $rel->fetchrow()) {
- my $query = "INSERT INTO frs_dlstats_grouptotal_agg (group_id,downloads) "
- ."VALUES (".$group_id.",".$downloads.")";
- my $reldb = $dbh->prepare($query);
- $reldb->execute();
-}
Deleted: trunk/gforge_base/evolvisforge/gforge/utils/download/db_dlstats_grp.pl
===================================================================
--- trunk/gforge_base/evolvisforge/gforge/utils/download/db_dlstats_grp.pl 2010-02-25 13:21:48 UTC (rev 409)
+++ trunk/gforge_base/evolvisforge/gforge/utils/download/db_dlstats_grp.pl 2010-02-25 13:21:51 UTC (rev 410)
@@ -1,42 +0,0 @@
-#!/usr/bin/perl
-#
-# $Id: db_dlstats_grp.pl,v 1.4 2000/08/10 16:54:32 tperdue Exp $
-#
-use DBI;
-
-require("../include.pl"); # Include all the predefined functions
-
-&db_connect;
-
-# doing this for all days for now
-my $query = "SELECT day FROM frs_dlstats_agg GROUP BY day";
-my $rel = $dbh->prepare($query);
-$rel->execute();
-
-# for each day
-while(my ($day) = $rel->fetchrow()) {
-	print "Processing day $day...\n";
- undef(%daydl);
-
- my $query = "SELECT (frs_dlstats_agg.downloads_http + frs_dlstats_agg.downloads_ftp) "
- ."AS downloads, filerelease.group_id FROM filerelease,frs_dlstats_agg "
- ."WHERE filerelease.filerelease_id=frs_dlstats_agg.file_id AND day=$day";
- my $reldb = $dbh->prepare($query);
- $reldb->execute();
-
- while (my ($downloads, $group_id) = $reldb->fetchrow()) {
- $daydl{$group_id} += $downloads;
- }
-
- #drop previous rows
- my $query = "DELETE FROM frs_dlstats_group_agg WHERE day=$day";
- my $reldel = $dbh->prepare($query);
- $reldel->execute();
-
- while (($keygrp,$valdl) = each (%daydl)) {
- my $query = "INSERT INTO frs_dlstats_group_agg (group_id,day,downloads) "
- ."VALUES (".$keygrp.",".$day.",".$valdl.")";
- my $relins = $dbh->prepare($query);
- $relins->execute();
- }
-}
Deleted: trunk/gforge_base/evolvisforge/gforge/utils/download/remission_filemaint.pl
===================================================================
--- trunk/gforge_base/evolvisforge/gforge/utils/download/remission_filemaint.pl 2010-02-25 13:21:48 UTC (rev 409)
+++ trunk/gforge_base/evolvisforge/gforge/utils/download/remission_filemaint.pl 2010-02-25 13:21:51 UTC (rev 410)
@@ -1,124 +0,0 @@
-#!/usr/bin/perl
-#
-# $Id: remission_filemaint.pl,v 1.4 2000/08/10 16:58:49 tperdue Exp $
-#
-use DBI;
-use File::Copy;
-
-require("../include.pl"); # Include all the predefined functions
-
-&db_connect;
-
-# grab Table information
-my $query = "SELECT groups.unix_group_name, filerelease.unix_partition, "
- . "filerelease.filename, filerelease.status, filerelease.filerelease_id, filerelease.old_filename "
- . "FROM groups,filerelease WHERE "
- . "filerelease.unix_box='remission' AND groups.group_id=filerelease.group_id";
-
-# if the quick option is selected, then only do for modified files;
-
-if ($ARGV[0] eq "quick") {
- print "Executing in quick mode.\n";
- $query .= " AND (filerelease.status='E' OR filerelease.status='M' OR filerelease.status='N')" ;
-}
-if (($ARGV[0] eq "verbose") or ($ARGV[1] eq "verbose")) {
- print "Executing in verbose mode.\n";
- $mode_v = 1;
-}
-
-my $c = $dbh->prepare($query);
-$c->execute();
-
-while(my ($unix_group_name,$unix_partition,$filename,$status,$filerelease_id,$old_filename) = $c->fetchrow()) {
- $fullpath = "/home/ftp/pub/sourceforge/$unix_group_name/";
- $newpath = "/home/ftp/incoming/";
- $newfilename = "${newpath}$filename";
- $fullfilename = "${fullpath}$filename";
- $fullfilenameold = "${fullpath}$old_filename";
- $delfilename = "${fullpath}~$filename";
-
- ######### ACTIVE FILES
- if ($status eq 'A') {
- # verify file exists in the right location
- if (!(-f "$fullfilename")) {
- print "[ERROR] - A Not Found - $fullfilename\n";
- }
- }
-
- ######### DELETED FILES
- elsif ($status eq 'D') {
- if (!(-f "$delfilename")) {
- print "[ERROR] - D Not Found - $delfilename\n";
- #fix it?
- if (-f "$fullfilename") {
- $command = "mv $fullfilename $delfilename";
- if (system($command)) {
- print "[ERROR] - Failed - $command\n";
- } else {
- print "[MOVE D FIX] - $command\n";
- }
- }
- }
- }
-
- ######### NEW FILES
- elsif ($status eq 'N') {
- if (-f "$newfilename") {
- $command = "mv $newfilename $fullfilename";
- if (system($command)) {
- print "[ERROR] - Failed - $command\n";
- } else {
- my $statusquery = "UPDATE filerelease SET status='A' WHERE filerelease_id=$filerelease_id";
- my $stat = $dbh->prepare($statusquery);
- $stat->execute();
- print "[MOVE NEW] - $command\n";
- }
- } else {
- print "[ERROR] - N Not Found - $newfilename\n";
- }
- }
-
- ######### FILE CHANGE, PENDING ACTIVE
- elsif ($status eq 'M') {
- if (-f "$fullfilenameold") {
- $command = "mv $fullfilenameold $fullfilename";
- if (system($command)) {
- print "[ERROR] - Failed - $command\n";
- } else {
- my $statusquery = "UPDATE filerelease SET status='A' WHERE filerelease_id=$filerelease_id";
- my $stat = $dbh->prepare($statusquery);
- $stat->execute();
- print "[MOVE] - $command\n";
- }
- } else {
- print "[ERROR] - M Old Not Found - $fullfilenameold";
- }
- }
-
- ######### FILE CHANGE, PENDING DELETE
- elsif ($status eq 'E') {
- if (-f "$fullfilenameold") {
- $command = "mv $fullfilenameold $delfilename";
- if (system($command)) {
- print "[ERROR] - Failed - $command\n";
- } else {
- my $statusquery = "UPDATE filerelease SET status='D' WHERE filerelease_id=$filerelease_id";
- my $stat = $dbh->prepare($statusquery);
- $stat->execute();
- print "[MOVE] - $command\n";
- }
- } elsif (-f "$fullfilename") {
- $command = "mv $fullfilename $delfilename";
- if (system($command)) {
- print "[ERROR] - Failed - $command\n";
- } else {
- my $statusquery = "UPDATE filerelease SET status='D' WHERE filerelease_id=$filerelease_id";
- my $stat = $dbh->prepare($statusquery);
- $stat->execute();
- print "[MOVE] - $command\n";
- }
- } else {
- print "[ERROR] - E Old Not Found - $fullfilenameold";
- }
- }
-}
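The deleted maintenance script above is a small state machine over the filerelease `status` column: A = active, D = deleted, N = new upload, M = changed/pending active, E = changed/pending delete. A sketch of just the move decision (Python for illustration; the fallback branches for D-repair and E are simplified, and the path keys are hypothetical):

```python
def planned_move(status, paths):
    """Return the (src, dst) pair the script would mv for a status,
    or None when no move is needed ('A' is only an existence check)."""
    if status == "N":                      # new upload: incoming -> release dir
        return (paths["incoming"], paths["full"])
    if status == "M":                      # changed, pending active: old -> current
        return (paths["old"], paths["full"])
    if status == "E":                      # changed, pending delete: old -> ~file
        return (paths["old"], paths["deleted"])
    if status == "D":                      # repair step: stray current -> ~file
        return (paths["full"], paths["deleted"])
    return None
```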
Deleted: trunk/gforge_base/evolvisforge/gforge/utils/download/stats_agr_filerelease.pl
===================================================================
--- trunk/gforge_base/evolvisforge/gforge/utils/download/stats_agr_filerelease.pl 2010-02-25 13:21:48 UTC (rev 409)
+++ trunk/gforge_base/evolvisforge/gforge/utils/download/stats_agr_filerelease.pl 2010-02-25 13:21:51 UTC (rev 410)
@@ -1,124 +0,0 @@
-#!/usr/bin/perl
-#
-# $Id: stats_agr_filerelease.pl,v 1.13 2000/09/21 03:44:12 msnelham Exp $
-#
-use DBI;
-require("../include.pl"); # Include all the predefined functions
-
-my $verbose = 1;
-
-	## if params were passed, we don't need to be running the aggregates.
-if ( $ARGV[0] && $ARGV[1] && $ARGV[2] ) {
-	print "Skipping the aggregate build...\n" if $verbose;
- exit;
-}
-
-
-&db_connect;
-
-##
-## Begin by collecting universal data into RAM.
-##
-$sql = "SELECT group_id FROM groups WHERE status='A'";
-$rel = $dbh->prepare($sql) || die "SQL parse error: $!";
-$rel->execute() || die "SQL execute error: $!";
-while ( @tmp_ar = $rel->fetchrow_array() ) {
- push( @groups, $tmp_ar[0] );
-}
-
-##
-## CREATE THE frs_dlstats_grouptotal_agg TABLE.
-##
-$sql = "DROP TABLE IF EXISTS frs_dlstats_grouptotal_agg_tmp";
-$rel = $dbh->do($sql) || die "SQL parse error: $!";
-
- ## Flush the downloads cache.
-%downloads = ();
-
- ## create the temp table;
-$sql = "CREATE TABLE frs_dlstats_grouptotal_agg_tmp ( "
- . "group_id int(11) DEFAULT '0' NOT NULL,"
- . "downloads int(11) DEFAULT '0' NOT NULL,"
- . "KEY idx_stats_agr_tmp_gid (group_id)"
- . ")";
-$rel = $dbh->do($sql) || die "SQL parse error: $!";
-
-$sql = "SELECT group_id,SUM(downloads) FROM stats_http_downloads GROUP BY group_id";
-$rel = $dbh->prepare($sql) || die "SQL parse error: $!";
-$rel->execute() || die "SQL execute error: $!";
-while ( @tmp_ar = $rel->fetchrow_array() ) {
- $downloads{ $tmp_ar[0] } += $tmp_ar[1];
-}
-
-$sql = "SELECT group_id,SUM(downloads) FROM stats_ftp_downloads GROUP BY group_id";
-$rel = $dbh->prepare($sql) || die "SQL parse error: $!";
-$rel->execute() || die "SQL execute error: $!";
-while ( @tmp_ar = $rel->fetchrow_array() ) {
- $downloads{ $tmp_ar[0] } += $tmp_ar[1];
-}
-
-foreach $group_id ( @groups ) {
- $xfers = $downloads{$group_id};
-
- $sql = "INSERT INTO frs_dlstats_grouptotal_agg_tmp VALUES ('$group_id','$xfers')";
- $rel = $dbh->do($sql) || die "SQL parse error: $!";
-}
-
-	## Drop the old aggregate table
-$sql="DROP TABLE IF EXISTS frs_dlstats_grouptotal_agg";
-$rel = $dbh->do($sql) || die "SQL parse error: $!";
-	## Relocate the new table to take its place.
-$sql="ALTER TABLE frs_dlstats_grouptotal_agg_tmp RENAME AS frs_dlstats_grouptotal_agg";
-$rel = $dbh->do($sql) || die "SQL parse error: $!";
-
-
-
-##
-## CREATE THE frs_dlstats_filetotal_agg TABLE.
-##
-$sql = "DROP TABLE IF EXISTS frs_dlstats_filetotal_agg_tmp";
-$rel = $dbh->do($sql) || die "SQL parse error: $!";
-
- ## Flush the downloads cache.
-%downloads = ();
-
- ## create the temp table;
-$sql = "CREATE TABLE frs_dlstats_filetotal_agg_tmp ( "
- . "file_id int(11) DEFAULT '0' NOT NULL,"
- . "downloads int(11) DEFAULT '0' NOT NULL,"
- . "KEY idx_stats_agr_tmp_fid (file_id)"
- . ")";
-$rel = $dbh->do($sql) || die "SQL parse error: $!";
-
-$sql = "SELECT filerelease_id,SUM(downloads) FROM stats_http_downloads GROUP BY filerelease_id";
-$rel = $dbh->prepare($sql) || die "SQL parse error: $!";
-$rel->execute() || die "SQL execute error: $!";
-while ( @tmp_ar = $rel->fetchrow_array() ) {
- $downloads{ $tmp_ar[0] } += $tmp_ar[1];
-}
-
-$sql = "SELECT filerelease_id,SUM(downloads) FROM stats_ftp_downloads GROUP BY filerelease_id";
-$rel = $dbh->prepare($sql) || die "SQL parse error: $!";
-$rel->execute() || die "SQL execute error: $!";
-while ( @tmp_ar = $rel->fetchrow_array() ) {
- $downloads{ $tmp_ar[0] } += $tmp_ar[1];
-}
-
-foreach $file_id ( keys %downloads ) {
- $xfers = $downloads{$file_id};
-
- $sql = "INSERT INTO frs_dlstats_filetotal_agg_tmp VALUES ('$file_id','$xfers')";
- $rel = $dbh->do($sql) || die "SQL parse error: $!";
-}
-
-	## Drop the old aggregate table
-$sql="DROP TABLE IF EXISTS frs_dlstats_filetotal_agg";
-$rel = $dbh->do($sql) || die "SQL parse error: $!";
-	## Relocate the new table to take its place.
-$sql="ALTER TABLE frs_dlstats_filetotal_agg_tmp RENAME AS frs_dlstats_filetotal_agg";
-$rel = $dbh->do($sql) || die "SQL parse error: $!";
-
-
-##
-## EOF
-##
Deleted: trunk/gforge_base/evolvisforge/gforge/utils/download/stats_ftp_logparse.pl
===================================================================
--- trunk/gforge_base/evolvisforge/gforge/utils/download/stats_ftp_logparse.pl 2010-02-25 13:21:48 UTC (rev 409)
+++ trunk/gforge_base/evolvisforge/gforge/utils/download/stats_ftp_logparse.pl 2010-02-25 13:21:51 UTC (rev 410)
@@ -1,112 +0,0 @@
-#!/usr/bin/perl
-#
-# $Id: stats_ftp_logparse.pl,v 1.5 2000/08/26 00:05:10 msnelham Exp $
-#
-use DBI;
-use Time::Local;
-use POSIX qw( strftime );
-require("../include.pl"); # Include all the predefined functions
-
-#######################
-## CONF VARS
-
- my $verbose = 1;
- my $chronolog_basedir = "/home/log";
-
-##
-#######################
-
-my ( $filerel, $query, $rel, %groups, %filerelease, $bytes, $filepath, $group_name, $filename, $files );
-
-&db_connect;
-
-if ( $ARGV[0] && $ARGV[1] && $ARGV[2] ) {
- ## Set params manually, so we can run
- ## regressive log parses.
- $year = $ARGV[0];
- $month = $ARGV[1];
- $day = $ARGV[2];
-} else {
- ## Otherwise, we just parse the logs for yesterday.
- ($day, $month, $year) = (gmtime(timegm( 0, 0, 0, (gmtime( time() - 86400 ))[3,4,5] )))[3,4,5];
- $year += 1900;
- $month += 1;
-}
-
-$file = "$chronolog_basedir/$year/" . sprintf("%02d",$month) . "/ftp_xferlog_$year"
- . sprintf("%02d",$month) . sprintf("%02d",$day) . ".log";
-
-print "Running year $year, month $month, day $day from \'$file\'\n" if $verbose;
-print "Caching file release information out of the database..." if $verbose;
-
-	## It makes things a whole lot faster for us if we cache the filerelease/group info beforehand.
-$query = "SELECT frs_file.file_id,groups.group_id,groups.unix_group_name,frs_file.filename "
- . "FROM frs_file,frs_release,frs_package,groups "
- . "WHERE ( groups.group_id = frs_package.group_id "
- . "AND frs_package.package_id = frs_release.package_id "
- . "AND frs_release.release_id = frs_file.release_id )";
-$rel = $dbh->prepare($query);
-$rel->execute();
-while( $filerel = $rel->fetchrow_arrayref() ) {
- $file_ident = ${$filerel}[2] . ":" . ${$filerel}[3];
- $filerelease{$file_ident} = ${$filerel}[0];
- $groups{${$filerel}[0]} = ${$filerel}[1];
-}
-
-print " done.\n" if $verbose;
-
-if ( -f $file ) {
- open(LOGFILE, "< $file" ) || die "Cannot open $file";
-} elsif( -f "$file.gz" ) {
- $file .= ".gz";
- open(LOGFILE, "/usr/bin/gunzip -c $file |" ) || die "Cannot open gunzip pipe for $file";
-}
-
-print "Beginning processing for logfile \'$file\'..." if $verbose;
-while (<LOGFILE>) {
- ## This commented out line, and the one below for $filepath, are for dates prior to 20000717
- ## if ( $_ =~ m/\/u7\/ftp\/pub\/sourceforge/ ) {
- if ( $_ =~ m/\/home\/ftp\/mounts\/u3\/sourceforge/ ) {
-
- $_ =~ m/^(\w+) (\w+)\s+(\d+) (\d\d):(\d\d):(\d\d) (\d\d\d\d) (\d+) ([^\s]+) (\d+) ([^\s]+) /;
- $bytes = $10;
- $filepath = $11;
-
- ## $filepath =~ m/^(\/home\/ftp\/mounts\/u3\/sourceforge\/)([^\/]+)\//;
- $filepath =~ m/^(\/home\/ftp\/mounts\/u3\/sourceforge\/)([^\/]+)\//;
- $group_name = $2;
-
- $filepath =~ m/\/([^\/]+)$/;
- $filename = $1;
-
- $file_ident = $group_name . ":" . $filename;
-
- if ( $filerelease{$file_ident} ) {
- $downloads{$filerelease{$file_ident}}++;
- }
- }
-}
-close(LOGFILE);
-
-print " done.\n" if $verbose;
-
-print "Deleting any existing records for day=" . sprintf("%d%02d%02d", $year, $month, $day) . ".\n" if $verbose;
-
-$query = "DELETE FROM stats_ftp_downloads WHERE day='" . sprintf("%d%02d%02d", $year, $month, $day) . "'";
-$dbh->do( $query );
-
-
-print "Inserting records into database: stats_ftp_downloads..." if $verbose;
-
-foreach $id ( keys %downloads ) {
- $query = "INSERT INTO stats_ftp_downloads (day,filerelease_id,group_id,downloads) ";
- $query .= "VALUES (\'" . sprintf("%d%02d%02d", $year, $month, $day) . "\',\'";
- $query .= $id . "\',\'" . $groups{$id} . "\',\'" . $downloads{$id} . "\')";
- $dbh->do( $query );
-}
-
-print " done.\n" if $verbose;
-
-##
-## EOF
-##
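The `gmtime(timegm(...))` round-trip in the deleted parser above simply computes "yesterday, UTC" and formats day keys as `YYYYMMDD`. The same computation, sketched with Python's datetime for illustration:

```python
from datetime import datetime, timedelta, timezone

def yesterday_key(now=None):
    """Return yesterday's UTC date as a YYYYMMDD string,
    matching sprintf("%d%02d%02d", $year, $month, $day)."""
    now = now or datetime.now(timezone.utc)
    y = now - timedelta(days=1)
    return "%d%02d%02d" % (y.year, y.month, y.day)
```

Doing the subtraction on a date value (rather than on split year/month/day fields) handles month and year boundaries for free.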
Deleted: trunk/gforge_base/evolvisforge/gforge/utils/download/stats_http_logparse.pl
===================================================================
--- trunk/gforge_base/evolvisforge/gforge/utils/download/stats_http_logparse.pl 2010-02-25 13:21:48 UTC (rev 409)
+++ trunk/gforge_base/evolvisforge/gforge/utils/download/stats_http_logparse.pl 2010-02-25 13:21:51 UTC (rev 410)
@@ -1,114 +0,0 @@
-#!/usr/bin/perl
-#
-# $Id: stats_http_logparse.pl,v 1.5 2000/08/26 00:07:58 msnelham Exp $
-#
-use DBI;
-use Time::Local;
-use POSIX qw( strftime );
-require("../include.pl"); # Include all the predefined functions
-
-#######################
-## CONF VARS
-
- my $verbose = 1;
- my $chronolog_basedir = "/home/log";
-
-##
-#######################
-
-my ( $filerel, $query, $rel, %groups, %filerelease, $bytes, $filepath, $group_name, $filename, $files );
-
-&db_connect;
-
-if ( $ARGV[0] && $ARGV[1] && $ARGV[2] ) {
- ## Set params manually, so we can run
- ## regressive log parses.
- $year = $ARGV[0];
- $month = $ARGV[1];
- $day = $ARGV[2];
-} else {
- ## Otherwise, we just parse the logs for yesterday.
- ($day, $month, $year) = (gmtime(timegm( 0, 0, 0, (gmtime( time() - 86400 ))[3,4,5] )))[3,4,5];
- $year += 1900;
- $month += 1;
-}
-
-$file = "$chronolog_basedir/$year/" . sprintf("%02d",$month) . "/http_combined_$year"
- . sprintf("%02d%02d", $month, $day) . ".log";
-
-print "Running year $year, month $month, day $day from \'$file\'\n" if $verbose;
-print "Caching file release information out of the database..." if $verbose;
-
-	## It makes things a whole lot faster for us if we cache the filerelease/group info beforehand.
-$query = "SELECT frs_file.file_id,groups.group_id,groups.unix_group_name,frs_file.filename "
- . "FROM frs_file,frs_release,frs_package,groups "
- . "WHERE ( groups.group_id = frs_package.group_id "
- . "AND frs_package.package_id = frs_release.package_id "
- . "AND frs_release.release_id = frs_file.release_id )";
-$rel = $dbh->prepare($query);
-$rel->execute();
-while( $filerel = $rel->fetchrow_arrayref() ) {
- $file_ident = ${$filerel}[2] . ":" . ${$filerel}[3];
- $filerelease{$file_ident} = ${$filerel}[0];
- $groups{${$filerel}[0]} = ${$filerel}[1];
-}
-
-print " done.\n" if $verbose;
-
-print "Beginning processing for logfile \'$file\'..." if $verbose;
-
-if ( -f $file ) {
- open(LOGFILE, "< $file" ) || die "Cannot open $file";
-} elsif( -f "$file.gz" ) {
- open(LOGFILE, "/usr/bin/gunzip -c $file.gz |" ) || die "Cannot open gunzip pipe for $file.gz";
-}
-
-while (<LOGFILE>) {
-
- $_ =~ m/^([\d\.]+).*\[(.+)\]\s\"GET (.+) HTTP.+(\d\d\d)\s(\d+)/;
-
- $filepath = $3;
- $code = $4;
-
- if ( $code =~ m/2\d\d/ ) {
-
-
- $filepath =~ m/^\/([^\/]+)\//;
- $basedir = $1;
-
- if ( $basedir ne "mirrors" && $basedir ne "pub" && $basedir ne "debian" ) {
-
- $filepath =~ m/\/([^\/]+)$/;
- $filename = $1;
-
- $file_ident = $basedir . ":" . $filename;
-
- if ( $filerelease{$file_ident} ) {
- $downloads{$filerelease{$file_ident}}++;
- }
- }
- }
-}
-close(LOGFILE);
-
-print " done.\n" if $verbose;
-
-print "Deleting any existing records for day=" . sprintf("%d%02d%02d", $year, $month, $day) . ".\n" if $verbose;
-
-$query = "DELETE FROM stats_http_downloads WHERE day='" . sprintf("%d%02d%02d", $year, $month, $day) . "'";
-$dbh->do( $query );
-
-print "Inserting records into database: stats_http_downloads..." if $verbose;
-
-foreach $id ( keys %downloads ) {
- $query = "INSERT INTO stats_http_downloads (day,filerelease_id,group_id,downloads) ";
- $query .= "VALUES (\'" . sprintf("%d%02d%02d", $year, $month, $day) . "\',\'";
- $query .= $id . "\',\'" . $groups{$id} . "\',\'" . $downloads{$id} . "\')";
- $dbh->do( $query );
-}
-
-print " done.\n" if $verbose;
-
-##
-## EOF
-##
Deleted: trunk/gforge_base/evolvisforge/gforge/utils/download/stats_logparse.sh
===================================================================
--- trunk/gforge_base/evolvisforge/gforge/utils/download/stats_logparse.sh 2010-02-25 13:21:48 UTC (rev 409)
+++ trunk/gforge_base/evolvisforge/gforge/utils/download/stats_logparse.sh 2010-02-25 13:21:51 UTC (rev 410)
@@ -1,15 +0,0 @@
-#!/bin/sh
-
-cd /root/bin/alexandria/utils/download
-
-## parse each logfile set
-./stats_ftp_logparse.pl $*
-./stats_http_logparse.pl $*
-
-## and then build the aggregates
-./stats_agr_filerelease.pl $*
-
-## after which, we update the nightly aggregates
-./stats_nightly_filerelease.pl $*
-
-
Deleted: trunk/gforge_base/evolvisforge/gforge/utils/download/stats_nightly_filerelease.pl
===================================================================
--- trunk/gforge_base/evolvisforge/gforge/utils/download/stats_nightly_filerelease.pl 2010-02-25 13:21:48 UTC (rev 409)
+++ trunk/gforge_base/evolvisforge/gforge/utils/download/stats_nightly_filerelease.pl 2010-02-25 13:21:51 UTC (rev 410)
@@ -1,105 +0,0 @@
-#!/usr/bin/perl
-#
-# $Id: stats_nightly_filerelease.pl,v 1.2 2000/08/23 01:31:15 msnelham Exp $
-#
-use DBI;
-use Time::Local;
-require("../include.pl"); # Include all the predefined functions
-
-my $verbose = 1;
-
-&db_connect;
-
-if ( $ARGV[0] && $ARGV[1] && $ARGV[2] ) {
- ## Set params manually, so we can run
- ## regressive log parses.
- $year = $ARGV[0];
- $month = $ARGV[1];
- $day = $ARGV[2];
-} else {
- ## Otherwise, we just parse the logs for yesterday.
- ($day, $month, $year) = (gmtime(timegm( 0, 0, 0, (gmtime( time() - 86400 ))[3,4,5] )))[3,4,5];
- $year += 1900;
- $month += 1;
-}
-
-$today = sprintf("%04d%02d%02d", $year, $month, $day);
-print "Running year $year, month $month, day $day.\n" if $verbose;
-
-##
-## POPULATE THE frs_dlstats_group_agg TABLE.
-##
-$sql = "SELECT group_id,SUM(downloads) FROM stats_http_downloads "
- . "WHERE ( day = '$today' ) GROUP BY group_id";
-$rel = $dbh->prepare($sql) || die "SQL parse error: $!";
-$rel->execute() || die "SQL execute error: $!";
-while ( @tmp_ar = $rel->fetchrow_array() ) {
- $downloads{ $tmp_ar[0] } += $tmp_ar[1];
-}
-
-$sql = "SELECT group_id,SUM(downloads) FROM stats_ftp_downloads "
- . "WHERE ( day = '$today' ) GROUP BY group_id";
-$rel = $dbh->prepare($sql) || die "SQL parse error: $!";
-$rel->execute() || die "SQL execute error: $!";
-while ( @tmp_ar = $rel->fetchrow_array() ) {
- $downloads{ $tmp_ar[0] } += $tmp_ar[1];
-}
-
-$sql = "DELETE FROM frs_dlstats_group_agg WHERE day='$today'";
-$rel = $dbh->do($sql) || die "SQL parse error: $!";
-foreach $group_id ( keys %downloads ) {
- $xfers = $downloads{$group_id};
- $total_xfers += $xfers;
- $sql = "INSERT INTO frs_dlstats_group_agg VALUES ('$group_id','$today','$xfers')";
- $rel = $dbh->do($sql) || die "SQL parse error: $!";
-}
-
-
- ## do some housekeeping before the next set.
-%downloads = ();
-$first_xfers = $total_xfers;
-$total_xfers = 0;
-
-
-##
-## POPULATE THE frs_dlstats_file_agg TABLE.
-##
-$sql = "SELECT filerelease_id,SUM(downloads) FROM stats_http_downloads "
- . "WHERE ( day = '$today' ) GROUP BY filerelease_id";
-$rel = $dbh->prepare($sql) || die "SQL parse error: $!";
-$rel->execute() || die "SQL execute error: $!";
-while ( @tmp_ar = $rel->fetchrow_array() ) {
- $downloads{ $tmp_ar[0] } += $tmp_ar[1];
-}
-
-$sql = "SELECT filerelease_id,SUM(downloads) FROM stats_ftp_downloads "
- . "WHERE ( day = '$today' ) GROUP BY filerelease_id";
-$rel = $dbh->prepare($sql) || die "SQL parse error: $!";
-$rel->execute() || die "SQL execute error: $!";
-while ( @tmp_ar = $rel->fetchrow_array() ) {
- $downloads{ $tmp_ar[0] } += $tmp_ar[1];
-}
-
-$sql = "DELETE FROM frs_dlstats_file_agg WHERE day='$today'";
-$rel = $dbh->do($sql) || die "SQL parse error: $!";
-foreach $file_id ( keys %downloads ) {
- $xfers = $downloads{$file_id};
- $total_xfers += $xfers;
- $sql = "INSERT INTO frs_dlstats_file_agg VALUES ('$file_id','$today','$xfers')";
- $rel = $dbh->do($sql) || die "SQL parse error: $!";
-}
-
-
-##
-## POPULATE THE downloads ROW OF THE stats_site TABLE
-##
-
-if ( $total_xfers != $first_xfers ) {
-	print "stats_nightly_filerelease.pl: THE TRANSFER STATS DID NOT AGREE!! FIX ME!!\n";
-}
-$sql = "UPDATE stats_site SET downloads='$total_xfers' WHERE (month='" . sprintf("%04d%02d", $year, $month) . "' AND day='$day') ";
-$rel = $dbh->do($sql) || die "SQL parse error: $!";
-
-##
-## EOF
-##
Modified: trunk/gforge_base/evolvisforge/gforge/utils/include.pl
===================================================================
--- trunk/gforge_base/evolvisforge/gforge/utils/include.pl 2010-02-25 13:21:48 UTC (rev 409)
+++ trunk/gforge_base/evolvisforge/gforge/utils/include.pl 2010-02-25 13:21:51 UTC (rev 410)
@@ -1,3 +1,5 @@
+#!/usr/bin/perl
+#
# $Id$
#
# include.pl - Include file for all the perl scripts that contains reusable functions
@@ -6,15 +8,15 @@
##############################
# Global Variables
##############################
-$db_include = "/etc/sourceforge/local.pl"; # Local Include file for database username and password
+$db_include = "/etc/local.inc"; # Local Include file for database username and password
$tar_dir = "/tmp"; # Place to put deleted user's accounts
$uid_add = "20000"; # How much to add to the database uid to get the unix uid
-$gid_add = "10000"; # How much to add to the database gid to get the unix uid
-$homedir_prefix = "/var/lib/sourceforge/chroot/home/users/"; # What prefix to add to the user's homedir
-$grpdir_prefix = "/var/lib/sourceforge/chroot/home/groups/"; # What prefix to add to the user's homedir
-$file_dir = "/var/lib/sourceforge/"; # Where should we stick files we're working with
-$cvs_root = "/var/lib/sourceforge/chroot/cvsroot/"; # Where should we stick files we're working with
-$dummy_uid = "9999"; # UserID of the dummy user that will own group's files
+$gid_add = "1000"; # How much to add to the database gid to get the unix uid
+$anoncvs_add = "2000"; # How much to add to the gid to get the unix uid of anoncvs user
+$homedir_prefix = "/home/users/"; # What prefix to add to the user's homedir
+$grpdir_prefix = "/home/groups/"; # What prefix to add to the user's homedir
+$file_dir = "/home/dummy/dumps/"; # Where should we stick files we're working with
+$dummy_uid = "103"; # UserID of the dummy user that will own group's files
$date = int(time()/3600/24); # Get the number of days since 1/1/1970 for /etc/shadow
$ldap_prefix = "/usr/local/ldap/bin/"; # Where OpenLDAP tools installed
@@ -22,16 +24,16 @@
# Configuration parsing Functions
##################################
sub parse_local_inc {
- require $db_include;
-# my ($foo, $bar);
-# # open up database include file and get the database variables
-# open(FILE, $db_include) || die "Can't open $db_include: $!\n";
-# while (<FILE>) {
-# next if ( /^\s*\/\// );
-# ($foo, $bar) = split /=/;
-# if ($foo) { eval $_ };
-# }
-# close(FILE);
+ my ($foo, $bar);
+
+ # open up database include file and get the database variables
+ open(FILE, $db_include) || die "Can't open $db_include: $!\n";
+ while (<FILE>) {
+ next if ( /^\s*\/\// );
+ ($foo, $bar) = split /=/;
+ if ($foo) { eval $_ };
+ }
+ close(FILE);
}
##############################
@@ -41,26 +43,12 @@
&parse_local_inc;
# connect to the database
- $dbh ||= DBI->connect("DBI:Pg:dbname=$sys_dbname;host=$sys_dbhost", "$sys_dbuser", "$sys_dbpasswd");
+ $dbh ||= DBI->connect("DBI:Pg:dbname=$sys_dbname;host=$sys_dbhost;user=$sys_dbuser;password=$sys_dbpasswd");
+ #$dbh ||= DBI->connect("DBI:mysql:$sys_dbname:$sys_dbhost", "$sys_dbuser", "$sys_dbpasswd");
die "Cannot connect to database: $!" if ( ! $dbh );
}
-sub db_drop_table_if_exists {
- my ($sql, $res, $n, $tn) ;
- $tn = shift ;
- $sql = "SELECT COUNT(*) FROM pg_class WHERE relname='$tn'";
- $res = $dbh->prepare($sql);
- $res->execute();
- ($n) = $res->fetchrow() ;
- $res->finish () ;
- if ($n != 0) {
- $sql = "DROP TABLE $tn";
- $res = $dbh->prepare($sql);
- $res->finish () ;
- }
-}
-
##############################
# File open function, spews the entire file to an array.
##############################
Added: trunk/gforge_base/evolvisforge/gforge/utils/include_2_5.pl
===================================================================
--- trunk/gforge_base/evolvisforge/gforge/utils/include_2_5.pl (rev 0)
+++ trunk/gforge_base/evolvisforge/gforge/utils/include_2_5.pl 2010-02-25 13:21:51 UTC (rev 410)
@@ -0,0 +1,90 @@
+# $Id$
+#
+# include.pl - Include file for all the perl scripts that contains reusable functions
+#
+
+##############################
+# Global Variables
+##############################
+$db_include = "/etc/sourceforge/local.pl"; # Local Include file for database username and password
+$tar_dir = "/tmp"; # Place to put deleted user's accounts
+$uid_add = "20000"; # How much to add to the database uid to get the unix uid
+$gid_add = "10000"; # How much to add to the database gid to get the unix uid
+$homedir_prefix = "/var/lib/sourceforge/chroot/home/users/"; # What prefix to add to the user's homedir
+$grpdir_prefix = "/var/lib/sourceforge/chroot/home/groups/"; # What prefix to add to the user's homedir
+$file_dir = "/var/lib/sourceforge/"; # Where should we stick files we're working with
+$cvs_root = "/var/lib/sourceforge/chroot/cvsroot/"; # Where should we stick files we're working with
+$dummy_uid = "9999"; # UserID of the dummy user that will own group's files
+$date = int(time()/3600/24); # Get the number of days since 1/1/1970 for /etc/shadow
+$ldap_prefix = "/usr/local/ldap/bin/"; # Where OpenLDAP tools installed
+
+##################################
+# Configuration parsing Functions
+##################################
+sub parse_local_inc {
+ require $db_include;
+# my ($foo, $bar);
+# # open up database include file and get the database variables
+# open(FILE, $db_include) || die "Can't open $db_include: $!\n";
+# while (<FILE>) {
+# next if ( /^\s*\/\// );
+# ($foo, $bar) = split /=/;
+# if ($foo) { eval $_ };
+# }
+# close(FILE);
+}
+
+##############################
+# Database Connect Functions
+##############################
+sub db_connect {
+ &parse_local_inc;
+
+ # connect to the database
+ $dbh ||= DBI->connect("DBI:Pg:dbname=$sys_dbname;host=$sys_dbhost", "$sys_dbuser", "$sys_dbpasswd");
+
+ die "Cannot connect to database: $!" if ( ! $dbh );
+}
+
+sub db_drop_table_if_exists {
+ my ($sql, $res, $n, $tn) ;
+ $tn = shift ;
+ $sql = "SELECT COUNT(*) FROM pg_class WHERE relname='$tn'";
+ $res = $dbh->prepare($sql);
+ $res->execute();
+ ($n) = $res->fetchrow() ;
+ $res->finish () ;
+ if ($n != 0) {
+ $sql = "DROP TABLE $tn";
+ $res = $dbh->prepare($sql);
+ $res->finish () ;
+ }
+}
+
+##############################
+# File open function, spews the entire file to an array.
+##############################
+sub open_array_file {
+ my $filename = shift(@_);
+
+ open (FD, $filename) || die "Can't open $filename: $!.\n";
+ my @tmp_array = <FD>;
+ close(FD);
+
+ return @tmp_array;
+}
+
+#############################
+# File write function.
+#############################
+sub write_array_file {
+ my ($file_name, @file_array) = @_;
+
+ open(FD, ">$file_name") || die "Can't open $file_name: $!.\n";
+ foreach (@file_array) {
+ if ($_ ne '') {
+ print FD;
+ }
+ }
+ close(FD);
+}
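The `open_array_file`/`write_array_file` pair added above can be exercised with a quick round trip. This is a minimal sketch that re-implements the same slurp/filter logic with lexical filehandles; the temp file and sample data are arbitrary:

```perl
#!/usr/bin/perl
# Round-trip sketch for the array file helpers: write non-empty lines,
# then slurp the whole file back into an array.
use strict;
use warnings;
use File::Temp qw(tempfile);

my ($tmp, $fname) = tempfile();
close $tmp;

my @lines = ("alice:1001\n", "", "bob:1002\n");

# write_array_file skips empty elements and prints the rest verbatim
open(my $out, '>', $fname) or die "Can't open $fname: $!.\n";
print $out $_ for grep { $_ ne '' } @lines;
close($out);

# open_array_file slurps the entire file into an array
open(my $in, '<', $fname) or die "Can't open $fname: $!.\n";
my @back = <$in>;
close($in);

print scalar(@back), "\n";
unlink($fname);
```

The empty string in the middle of `@lines` is dropped on write, so the read-back array has two elements.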
Modified: trunk/gforge_base/evolvisforge/gforge/utils/sql2ldif.pl
===================================================================
--- trunk/gforge_base/evolvisforge/gforge/utils/sql2ldif.pl 2010-02-25 13:21:48 UTC (rev 409)
+++ trunk/gforge_base/evolvisforge/gforge/utils/sql2ldif.pl 2010-02-25 13:21:51 UTC (rev 410)
@@ -6,21 +6,26 @@
# ./sql2ldif.pl : Dump only top-level ou map
# ./sql2ldif.pl --full : Dump full database (ouch!)
#
-# $Id: sql2ldif.pl,v 1.8 2000/12/10 23:07:31 pfalcon Exp $
+# $Id: sql2ldif.pl,v 1.13 2001/03/26 20:38:01 pfalcon Exp $
#
use DBI;
#require("base64.pl"); # Include all the predefined functions
-require("/usr/lib/sourceforge/lib/include.pl"); # Include all the predefined functions
-$chroot="/var/lib/sourceforge/chroot";
+require("include.pl"); # Include all the predefined functions
&db_connect;
+
+sub homedir {
+ my ($user) = @_;
+ return "/home/users/".substr($user,0,1)."/".substr($user,0,2)."/$user";
+}
+
dump_header();
-# if (!($#ARGV+1)) {
-# exit;
-# }
+if (!($#ARGV+1)) {
+ exit;
+}
#
# Dump user entries (ou=People)
@@ -28,11 +33,11 @@
# We give user maximum of privileges assigned to one by groups ;-(
my $query = "
-SELECT user_name,realname,shell,unix_pw,unix_uid,MAX(cvs_flags),email
+SELECT user_name,realname,shell,unix_pw,unix_uid,MAX(cvs_flags)
FROM users,user_group
WHERE unix_status='A'
AND users.user_id=user_group.user_id
-GROUP BY user_name,realname,shell,unix_pw,unix_uid,email
+GROUP BY user_name,realname,shell,unix_pw,unix_uid
";
my $rel = $dbh->prepare($query);
$rel->execute();
@@ -42,10 +47,12 @@
@cvs_flags2shell=('/dev/null','/bin/cvssh','/bin/bash');
-while(my ($username, $realname, $shell, $pw, $uid, $cvs_flags, $email) = $rel->fetchrow()) {
+#
+# Note: unix uid = db unix_uid + $uid_add
+#
+while(my ($username, $realname, $shell, $pw, $uid, $cvs_flags) = $rel->fetchrow()) {
+ $uid+=$uid_add;
print "dn: uid=$username,ou=People,$sys_ldap_base_dn\n";
- #CB# To have the same id than generated by new_parse
- $uid += $uid_add;
print "uid: $username\n";
if (!$realname) { $realname='?'; }
$realname=~tr#\x80-\xff#?#; # it should be UTF-8 encoded, we just drop non-ascii chars
@@ -56,31 +63,18 @@
objectClass: shadowAccount
objectClass: x-sourceforgeAccount
";
- #CB# gid was 100, i replace with $gid=$uid
- $gid = $uid;
print "userPassword: {crypt}$pw
-shadowLastChange: 10879
+shadowLastChange: 1
shadowMax: 99999
shadowWarning: 7
loginShell: $shell
x-cvsShell: $cvs_flags2shell[$cvs_flags]
uidNumber: $uid
-gidNumber: $gid
-homeDirectory: $chroot/home/users/$username
+gidNumber: 100
+homeDirectory: ".homedir($username)."
gecos: $realname
-x-forward-email: $email
";
- #CB# To have the same id than generated by new_parse
- #CB# A group per user
- print "dn: cn=$username,ou=Group,$sys_ldap_base_dn
-objectClass: posixGroup
-objectClass: top
-cn: $username
-userPassword: {crypt}x
-gidNumber: $gid
-
-";
}
#
@@ -88,64 +82,35 @@
#
my $query = "
-SELECT group_id,unix_group_name
-FROM groups
-WHERE status='A'
+SELECT groups.group_id,unix_group_name,user_name
+FROM groups,users,user_group
+WHERE groups.status='A'
+AND groups.group_id=user_group.group_id
+AND user_group.user_id=users.user_id
+ORDER BY groups.group_id
";
my $rel = $dbh->prepare($query);
$rel->execute();
-while(my ($gid, $groupname) = $rel->fetchrow()) {
- my $query = "
-SELECT user_name
-FROM users,user_group
-WHERE group_id=$gid
- AND users.user_id=user_group.user_id
-";
- my $rel = $dbh->prepare($query);
- $rel->execute();
+#
+# Note: unix gid = db group_id + $gid_add
+#
+$last_gid=-1;
+while(my ($gid, $groupname, $member) = $rel->fetchrow()) {
+ $gid+=$gid_add;
- #CB# To have the same id than generated by new_parse
- $gid += $gid_add;
- print "dn: cn=$groupname,ou=Group,$sys_ldap_base_dn
+ if ($gid != $last_gid) {
+ print "\ndn: cn=$groupname,ou=Group,$sys_ldap_base_dn
objectClass: posixGroup
objectClass: top
cn: $groupname
userPassword: {crypt}x
gidNumber: $gid
";
-
- while(my ($username) = $rel->fetchrow()) {
- print "memberUid: $username\n";
+ $last_gid=$gid;
}
- print "\n";
-}
-#
-# Dump mailing-lists entries (ou=mailingList)
-#
-
-$query = "SELECT mail_group_list.group_list_id,
- mail_group_list.list_name,
- users.user_name,
- mail_group_list.password,
- mail_group_list.description
- FROM mail_group_list, users
- WHERE mail_group_list.status = 3
- AND mail_group_list.list_admin = users.user_id" ;
-$rel = $dbh->prepare($query);
-$rel->execute();
-
-while(my ($group_list_id, $listname, $user_name, $password, $description) = $rel->fetchrow()) {
- print "dn: cn=$listname,ou=mailingList,$sys_ldap_base_dn
-objectClass: x-sourceforgeMailingList
-objectClass: top
-cn: $listname
-listPostAddress: \"|/var/lib/mailman/mail/wrapper post $listname\"
-listOwnerAddress: \"|/var/lib/mailman/mail/wrapper mailowner $listname\"
-listRequestAddress: \"|/var/lib/mailman/mail/wrapper mailcmd $listname\"
-";
- print "\n";
+ print "memberUid: $member\n";
}
#
@@ -153,38 +118,62 @@
#
my $query = "
-SELECT group_id,unix_group_name
-FROM groups
-WHERE status='A'
+SELECT groups.group_id,unix_group_name,user_name,cvs_flags
+FROM groups,users,user_group
+WHERE groups.status='A'
+AND groups.group_id=user_group.group_id
+AND user_group.user_id=users.user_id
+ORDER BY groups.group_id
";
+# we need cvsGroup even if no member has permission
+#AND user_group.cvs_flags > 0
my $rel = $dbh->prepare($query);
$rel->execute();
-while(my ($gid, $groupname) = $rel->fetchrow()) {
- my $query = "
-SELECT user_name
-FROM users,user_group
-WHERE group_id=$gid
- AND users.user_id=user_group.user_id
- AND user_group.cvs_flags > 0
-";
- my $rel = $dbh->prepare($query);
- $rel->execute();
+$last_gid=-1;
+while(my ($gid, $groupname, $member, $cvs) = $rel->fetchrow()) {
+ $gid+=$gid_add;
- #CB# To have the same id than generated by new_parse
- $gid += $gid_add;
- print "dn: cn=$groupname,ou=cvsGroup,$sys_ldap_base_dn
+ if ($gid != $last_gid) {
+
+ # virtual member for anoncvs access
+ print "\ndn: uid=anoncvs_$groupname,ou=People,$sys_ldap_base_dn\n";
+ print "uid: anoncvs_$groupname\n";
+ print "cn: anoncvs\n";
+ print "objectClass: account
+objectClass: posixAccount
+objectClass: top
+objectClass: shadowAccount
+objectClass: x-sourceforgeAccount
+";
+ print "userPassword: {crypt}x
+shadowLastChange: 1
+shadowMax: 99999
+shadowWarning: 7
+loginShell: /bin/false
+x-cvsShell: /bin/false
+";
+ print "uidNumber: ",$gid+$anoncvs_add;
+ print "
+gidNumber: $gid
+homeDirectory: ".homedir("anoncvs_$groupname")."
+gecos: anoncvs
+";
+ # CVS group itself
+ print "\ndn: cn=$groupname,ou=cvsGroup,$sys_ldap_base_dn
objectClass: posixGroup
objectClass: top
cn: $groupname
userPassword: {crypt}x
gidNumber: $gid
+memberUid: anoncvs_$groupname
";
+ $last_gid=$gid;
+ }
- while(my ($username) = $rel->fetchrow()) {
- print "memberUid: $username\n";
+ if ($cvs>0) {
+ print "memberUid: $member\n";
}
- print "\n";
}
#
@@ -234,31 +223,13 @@
objectClass: domainRelatedObject
associatedDomain: $sys_default_domain
-dn: ou=mailingList,$sys_ldap_base_dn
-ou: mailingList
+dn: cn=Replicator,dc=sourceforge,dc=net
+cn: Replicator
+sn: Replicator the Robot
+description: empty
objectClass: top
-objectClass: organizationalUnit
-objectClass: domainRelatedObject
-associatedDomain: $sys_lists_host
-
-dn: uid=dummy,ou=People,$sys_ldap_base_dn
-uid: dummy
-cn: Dummy User
-objectClass: account
-objectClass: posixAccount
-objectClass: top
-objectClass: shadowAccount
-objectClass: x-sourceforgeAccount
+objectClass: person
userPassword: {crypt}x
-shadowLastChange: 10879
-shadowMax: 99999
-shadowWarning: 7
-loginShell: /bin/false
-x-cvsShell: /bin/false
-uidNumber: $dummy_uid
-gidNumber: $dummy_uid
-homeDirectory: $chroot/home/users/dummy
-gecos: Dummy User
";
}
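The new `homedir` helper in sql2ldif.pl replaces the flat `$chroot/home/users/$username` layout with a two-level hash: `/home/users/<first letter>/<first two letters>/<username>`. A standalone sketch of that function, with a sample username:

```perl
#!/usr/bin/perl
# Two-level hashed home directory layout, as introduced in sql2ldif.pl.
use strict;
use warnings;

sub homedir {
    my ($user) = @_;
    return "/home/users/" . substr($user, 0, 1) . "/"
         . substr($user, 0, 2) . "/$user";
}

print homedir("pfalcon"), "\n";   # /home/users/p/pf/pfalcon
```

Hashing by the first one and two characters keeps any single directory from accumulating every user on the site.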
Added: trunk/gforge_base/evolvisforge/gforge/utils/sql2ldif_2_5.pl
===================================================================
--- trunk/gforge_base/evolvisforge/gforge/utils/sql2ldif_2_5.pl (rev 0)
+++ trunk/gforge_base/evolvisforge/gforge/utils/sql2ldif_2_5.pl 2010-02-25 13:21:51 UTC (rev 410)
@@ -0,0 +1,264 @@
+#!/usr/bin/perl
+#
+# Convert SQL user database to LDIF format (for SourceForge LDAP schema)
+# by pfalcon at users.sourceforge.net 2000-10-17
+#
+# ./sql2ldif.pl : Dump only top-level ou map
+# ./sql2ldif.pl --full : Dump full database (ouch!)
+#
+# $Id$
+#
+
+use DBI;
+
+#require("base64.pl"); # Include all the predefined functions
+require("/usr/lib/sourceforge/lib/include.pl"); # Include all the predefined functions
+$chroot="/var/lib/sourceforge/chroot";
+&db_connect;
+
+dump_header();
+
+# if (!($#ARGV+1)) {
+# exit;
+# }
+
+#
+# Dump user entries (ou=People)
+#
+
+# We give user maximum of privileges assigned to one by groups ;-(
+my $query = "
+SELECT user_name,realname,shell,unix_pw,unix_uid,MAX(cvs_flags),email
+FROM users,user_group
+WHERE unix_status='A'
+ AND users.user_id=user_group.user_id
+GROUP BY user_name,realname,shell,unix_pw,unix_uid,email
+";
+my $rel = $dbh->prepare($query);
+$rel->execute();
+
+#print "$sys_ldap_host\n";
+#print "$sys_ldap_base_dn\n";
+
+ at cvs_flags2shell=('/dev/null','/bin/cvssh','/bin/bash');
+
+while(my ($username, $realname, $shell, $pw, $uid, $cvs_flags, $email) = $rel->fetchrow()) {
+ print "dn: uid=$username,ou=People,$sys_ldap_base_dn\n";
+ #CB# To have the same id than generated by new_parse
+ $uid += $uid_add;
+ print "uid: $username\n";
+ if (!$realname) { $realname='?'; }
+ $realname=~tr#\x80-\xff#?#; # it should be UTF-8 encoded, we just drop non-ascii chars
+ print "cn: $realname\n";
+ print "objectClass: account
+objectClass: posixAccount
+objectClass: top
+objectClass: shadowAccount
+objectClass: x-sourceforgeAccount
+";
+ #CB# gid was 100, i replace with $gid=$uid
+ $gid = $uid;
+ print "userPassword: {crypt}$pw
+shadowLastChange: 10879
+shadowMax: 99999
+shadowWarning: 7
+loginShell: $shell
+x-cvsShell: $cvs_flags2shell[$cvs_flags]
+uidNumber: $uid
+gidNumber: $gid
+homeDirectory: $chroot/home/users/$username
+gecos: $realname
+x-forward-email: $email
+
+";
+ #CB# To have the same id than generated by new_parse
+ #CB# A group per user
+ print "dn: cn=$username,ou=Group,$sys_ldap_base_dn
+objectClass: posixGroup
+objectClass: top
+cn: $username
+userPassword: {crypt}x
+gidNumber: $gid
+
+";
+}
+
+#
+# Dump group entries (ou=Group)
+#
+
+my $query = "
+SELECT group_id,unix_group_name
+FROM groups
+WHERE status='A'
+";
+my $rel = $dbh->prepare($query);
+$rel->execute();
+
+while(my ($gid, $groupname) = $rel->fetchrow()) {
+ my $query = "
+SELECT user_name
+FROM users,user_group
+WHERE group_id=$gid
+ AND users.user_id=user_group.user_id
+";
+ my $rel = $dbh->prepare($query);
+ $rel->execute();
+
+ #CB# To have the same id than generated by new_parse
+ $gid += $gid_add;
+ print "dn: cn=$groupname,ou=Group,$sys_ldap_base_dn
+objectClass: posixGroup
+objectClass: top
+cn: $groupname
+userPassword: {crypt}x
+gidNumber: $gid
+";
+
+ while(my ($username) = $rel->fetchrow()) {
+ print "memberUid: $username\n";
+ }
+ print "\n";
+}
+
+#
+# Dump mailing-lists entries (ou=mailingList)
+#
+
+$query = "SELECT mail_group_list.group_list_id,
+ mail_group_list.list_name,
+ users.user_name,
+ mail_group_list.password,
+ mail_group_list.description
+ FROM mail_group_list, users
+ WHERE mail_group_list.status = 3
+ AND mail_group_list.list_admin = users.user_id" ;
+$rel = $dbh->prepare($query);
+$rel->execute();
+
+while(my ($group_list_id, $listname, $user_name, $password, $description) = $rel->fetchrow()) {
+ print "dn: cn=$listname,ou=mailingList,$sys_ldap_base_dn
+objectClass: x-sourceforgeMailingList
+objectClass: top
+cn: $listname
+listPostAddress: \"|/var/lib/mailman/mail/wrapper post $listname\"
+listOwnerAddress: \"|/var/lib/mailman/mail/wrapper mailowner $listname\"
+listRequestAddress: \"|/var/lib/mailman/mail/wrapper mailcmd $listname\"
+";
+ print "\n";
+}
+
+#
+# Dump CVS group entries (ou=cvsGroup)
+#
+
+my $query = "
+SELECT group_id,unix_group_name
+FROM groups
+WHERE status='A'
+";
+my $rel = $dbh->prepare($query);
+$rel->execute();
+
+while(my ($gid, $groupname) = $rel->fetchrow()) {
+ my $query = "
+SELECT user_name
+FROM users,user_group
+WHERE group_id=$gid
+ AND users.user_id=user_group.user_id
+ AND user_group.cvs_flags > 0
+";
+ my $rel = $dbh->prepare($query);
+ $rel->execute();
+
+ #CB# To have the same id than generated by new_parse
+ $gid += $gid_add;
+ print "dn: cn=$groupname,ou=cvsGroup,$sys_ldap_base_dn
+objectClass: posixGroup
+objectClass: top
+cn: $groupname
+userPassword: {crypt}x
+gidNumber: $gid
+";
+
+ while(my ($username) = $rel->fetchrow()) {
+ print "memberUid: $username\n";
+ }
+ print "\n";
+}
+
+#
+# Auxiliary functions
+#
+
+sub dump_header {
+ print "dn: $sys_ldap_base_dn
+dc: sourceforge
+objectClass: top
+objectClass: domain
+objectClass: domainRelatedObject
+associatedDomain: $sys_default_domain
+
+dn: ou=Hosts,$sys_ldap_base_dn
+ou: Hosts
+objectClass: top
+objectClass: organizationalUnit
+objectClass: domainRelatedObject
+associatedDomain: $sys_default_domain
+
+dn: ou=People,$sys_ldap_base_dn
+ou: People
+objectClass: top
+objectClass: organizationalUnit
+objectClass: domainRelatedObject
+associatedDomain: $sys_default_domain
+
+dn: ou=Aliases,$sys_ldap_base_dn
+ou: Aliases
+objectClass: top
+objectClass: organizationalUnit
+objectClass: domainRelatedObject
+associatedDomain: $sys_default_domain
+
+dn: ou=Group,$sys_ldap_base_dn
+ou: Group
+objectClass: top
+objectClass: organizationalUnit
+objectClass: domainRelatedObject
+associatedDomain: $sys_default_domain
+
+dn: ou=cvsGroup,$sys_ldap_base_dn
+ou: cvsGroup
+objectClass: top
+objectClass: organizationalUnit
+objectClass: domainRelatedObject
+associatedDomain: $sys_default_domain
+
+dn: ou=mailingList,$sys_ldap_base_dn
+ou: mailingList
+objectClass: top
+objectClass: organizationalUnit
+objectClass: domainRelatedObject
+associatedDomain: $sys_lists_host
+
+dn: uid=dummy,ou=People,$sys_ldap_base_dn
+uid: dummy
+cn: Dummy User
+objectClass: account
+objectClass: posixAccount
+objectClass: top
+objectClass: shadowAccount
+objectClass: x-sourceforgeAccount
+userPassword: {crypt}x
+shadowLastChange: 10879
+shadowMax: 99999
+shadowWarning: 7
+loginShell: /bin/false
+x-cvsShell: /bin/false
+uidNumber: $dummy_uid
+gidNumber: $dummy_uid
+homeDirectory: $chroot/home/users/dummy
+gecos: Dummy User
+
+";
+}
Property changes on: trunk/gforge_base/evolvisforge/gforge/utils/sql2ldif_2_5.pl
___________________________________________________________________
Name: svn:executable
+ *
Modified: trunk/gforge_base/evolvisforge/gforge/utils/underworld-dummy/ssh_dump.pl
===================================================================
--- trunk/gforge_base/evolvisforge/gforge/utils/underworld-dummy/ssh_dump.pl 2010-02-25 13:21:48 UTC (rev 409)
+++ trunk/gforge_base/evolvisforge/gforge/utils/underworld-dummy/ssh_dump.pl 2010-02-25 13:21:51 UTC (rev 410)
@@ -6,20 +6,23 @@
#
use DBI;
-require("/usr/lib/sourceforge/lib/include.pl"); # Include all the predefined functions
+require("../include.pl"); # Include all the predefined functions
my @ssh_array = ();
&db_connect;
# Dump the Table information
-$query = "SELECT user_name,unix_uid,authorized_keys FROM users WHERE authorized_keys != ''";
+$query = "SELECT user_name,authorized_keys FROM users WHERE authorized_keys != '' AND authorized_keys IS NOT NULL";
$c = $dbh->prepare($query);
$c->execute();
-while(my ($username, $unix_uid, $ssh_key) = $c->fetchrow()) {
- $new_list = "$username:$unix_uid:$ssh_key\n";
+while(my ($username, $ssh_key) = $c->fetchrow()) {
+
+ $new_list = "$username:$ssh_key\n";
+
push @ssh_array, $new_list;
}
+
# Now write out the files
-write_array_file($file_dir."dumps/ssh_dump", @ssh_array);
+write_array_file($file_dir."ssh_dump", @ssh_array);
Added: trunk/gforge_base/evolvisforge/gforge/utils/underworld-dummy/ssh_dump_2_5.pl
===================================================================
--- trunk/gforge_base/evolvisforge/gforge/utils/underworld-dummy/ssh_dump_2_5.pl (rev 0)
+++ trunk/gforge_base/evolvisforge/gforge/utils/underworld-dummy/ssh_dump_2_5.pl 2010-02-25 13:21:51 UTC (rev 410)
@@ -0,0 +1,25 @@
+#!/usr/bin/perl
+#
+# $Id$
+#
+# ssh_dump.pl - Script to suck data outta the database to be processed by ssh_create.pl
+#
+use DBI;
+
+require("/usr/lib/sourceforge/lib/include.pl"); # Include all the predefined functions
+
+my @ssh_array = ();
+
+&db_connect;
+
+# Dump the Table information
+$query = "SELECT user_name,unix_uid,authorized_keys FROM users WHERE authorized_keys != ''";
+$c = $dbh->prepare($query);
+$c->execute();
+while(my ($username, $unix_uid, $ssh_key) = $c->fetchrow()) {
+ $new_list = "$username:$unix_uid:$ssh_key\n";
+ push @ssh_array, $new_list;
+}
+
+# Now write out the files
+write_array_file($file_dir."dumps/ssh_dump", @ssh_array);
Property changes on: trunk/gforge_base/evolvisforge/gforge/utils/underworld-dummy/ssh_dump_2_5.pl
___________________________________________________________________
Name: svn:executable
+ *
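Note the two dump formats now differ: ssh_dump.pl writes `user:key` lines while the retained ssh_dump_2_5.pl still writes `user:uid:key`. A consumer (ssh_create.pl is referenced in the script comments but not shown here) would need to split with an explicit field limit so colons inside the key material survive; a sketch for the three-field format:

```perl
#!/usr/bin/perl
# Parse one "user:uid:key" line from the 2.5-style ssh_dump output.
# The sample line is illustrative, not taken from a real dump.
use strict;
use warnings;

my $line = "alice:10042:ssh-rsa AAAAB3Nza:extra==";

# Limit of 3 keeps any further colons inside the key field.
my ($user, $uid, $key) = split /:/, $line, 3;

print "$user $uid\n";
```

For the new two-field format the same idea applies with a limit of 2.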
Deleted: trunk/gforge_base/evolvisforge/gforge/utils/underworld-root/db_stats_cvs_history.pl
===================================================================
--- trunk/gforge_base/evolvisforge/gforge/utils/underworld-root/db_stats_cvs_history.pl 2010-02-25 13:21:48 UTC (rev 409)
+++ trunk/gforge_base/evolvisforge/gforge/utils/underworld-root/db_stats_cvs_history.pl 2010-02-25 13:21:51 UTC (rev 410)
@@ -1,120 +0,0 @@
-#!/usr/bin/perl
-##
-## db_cvs_history.pl
-##
-## NIGHTLY SCRIPT
-##
-## Pulls the parsed CVS datafile (generated by cvs_history_parse.pl ) from the
-## cvs server, and parses it into the database
-##
-## Written by Matthew Snelham <matthew at valinux.com>
-##
-#use strict; ## annoying include requirements
-use DBI;
-use Time::Local;
-use POSIX qw( strftime );
-require("/usr/lib/sourceforge/utils/include.pl"); # Include all the predefined functions
-&db_connect;
-
-my ($logfile, $sql, $res, $temp, %groups, $group_id, $errors );
-my $verbose = 1;
-
-##
-## Set begin and end times (in epoch seconds) of day to be run
-## Either specified on the command line, or auto-calculated
-## to run yesterday's data.
-##
-if ( $ARGV[0] && $ARGV[1] && $ARGV[2] ) {
-
- $day_begin = timegm( 0, 0, 0, $ARGV[2], $ARGV[1] - 1, $ARGV[0] - 1900 );
- $day_end = timegm( 0, 0, 0, (gmtime( $day_begin + 86400 ))[3,4,5] );
-
-} else {
-
- ## Start at midnight last night.
- $day_end = timegm( 0, 0, 0, (gmtime( time() ))[3,4,5] );
- ## go until midnight yesterday.
- $day_begin = timegm( 0, 0, 0, (gmtime( time() - 86400 ))[3,4,5] );
-
-}
-
- ## Preformat the important date strings.
-$year = strftime("%Y", gmtime( $day_begin ) );
-$mon = strftime("%m", gmtime( $day_begin ) );
-$week = strftime("%U", gmtime( $day_begin ) ); ## GNU ext.
-$day = strftime("%d", gmtime( $day_begin ) );
-print "Running week $week, day $day month $mon year $year \n" if $verbose;
-
-
- ## We'll pull down the parsed CVS log from the CVS server via http?! <sigh>
-print "Pulling down preprocessed logfile from cvs1...\n" if $verbose;
-$logfile = "/tmp/cvs_history.txt";
-unlink("$logfile");
-`wget -q -O $logfile http://cvs1/cvslogs/$year/$mon/cvs_traffic_$year$mon$day.log`;
-print `ls -la $logfile`;
-
- ## Now, we will pull all of the project ID's and names into a *massive*
- ## hash, because it will save us some real time in the log processing.
-print "Caching group information from groups table.\n" if $verbose;
-$sql = "SELECT group_id,unix_group_name FROM groups";
-$res = $dbh->prepare($sql);
-$res->execute();
-while ( $temp = $res->fetchrow_arrayref() ) {
- $groups{${$temp}[1]} = ${$temp}[0];
-}
-##
-## wrap this process in a transaction
-##
-$dbh->do( "BEGIN WORK;" );
-
- ## begin parsing the log file line by line.
-print "Parsing the information into the database..." if $verbose;
-open( LOGFILE, $logfile ) or die "Cannot open /tmp/boa_stats.txt";
-while(<LOGFILE>) {
-
- if ( $_ =~ /^G::/ ) {
- chomp($_);
-
- ## (G|U|E)::proj_name::user_name::checkouts::commits::adds
- my ($type, $group, $user, $checkouts, $commits, $adds) = split( /::/, $_, 6 );
-
- $group_id = $groups{$group};
-
- if ( $group_id == 0 ) {
- print STDERR "$_";
- print STDERR "db_cvs_history.pl: bad unix_group_name \'$name\' \n";
- }
-
- $sql = "INSERT INTO stats_project_build_tmp
- (group_id,stat,value)
- VALUES ('" . $group_id . "',"
- . "'cvs_checkouts','" . $checkouts . "')";
- $dbh->do( $sql );
- $sql = "INSERT INTO stats_project_build_tmp
- (group_id,stat,value)
- VALUES ('" . $group_id . "',"
- . "'cvs_commits','" . $commits . "')";
- $dbh->do( $sql );
-
- $sql = "INSERT INTO stats_project_build_tmp
- (group_id,stat,value)
- VALUES ('" . $group_id . "',"
- . "'cvs_adds','" . $adds . "')";
- $dbh->do( $sql );
-
- } elsif ( $_ =~ /^E::/ ) {
- $errors++;
- }
-
-}
-close( LOGFILE );
-##
-## wrap this process in a transaction
-##
-$dbh->do( "COMMIT WORK;" );
-
-print " done.\n" if $verbose;
-
-##
-## EOF
-##
Deleted: trunk/gforge_base/evolvisforge/gforge/utils/underworld-root/db_stats_prepare.pl
===================================================================
--- trunk/gforge_base/evolvisforge/gforge/utils/underworld-root/db_stats_prepare.pl 2010-02-25 13:21:48 UTC (rev 409)
+++ trunk/gforge_base/evolvisforge/gforge/utils/underworld-root/db_stats_prepare.pl 2010-02-25 13:21:51 UTC (rev 410)
@@ -1,32 +0,0 @@
-#!/usr/bin/perl
-#
-# $Id: db_stats_prepare.pl,v 1.1 2000/08/21 17:08:51 msnelham Exp $
-#
-use DBI;
-
-my $verbose = 1;
-
-require("/usr/lib/sourceforge/utils/include.pl");
-$dbh = &db_connect();
-
-##
-## Drop the tmp table.
-##
-$sql = "DROP TABLE IF EXISTS stats_project_build_tmp";
-$rel = $dbh->prepare($sql)->execute();
-print "Dropped stats_project_build_tmp...\n" if $verbose;
-
-##
-## Create a temporary table to hold all of our stats
-## for agregation at the end.
-##
-$sql = "CREATE TABLE stats_project_build_tmp (
- group_id int NOT NULL,
- stat char(14) NOT NULL,
- value int NOT NULL DEFAULT '0',
- KEY idx_archive_build_group (group_id),
- KEY idx_archive_build_stat (stat)
- )";
-$rel = $dbh->prepare($sql)->execute();
-print "Created stats_project_build_tmp...\n" if $verbose;
-
Deleted: trunk/gforge_base/evolvisforge/gforge/utils/underworld-root/db_stats_projects_nightly.pl
===================================================================
--- trunk/gforge_base/evolvisforge/gforge/utils/underworld-root/db_stats_projects_nightly.pl 2010-02-25 13:21:48 UTC (rev 409)
+++ trunk/gforge_base/evolvisforge/gforge/utils/underworld-root/db_stats_projects_nightly.pl 2010-02-25 13:21:51 UTC (rev 410)
@@ -1,356 +0,0 @@
-#!/usr/bin/perl
-#
-# $Id: db_stats_projects_nightly.pl,v 1.7 2000/11/03 21:54:51 pgport Exp $
-#
-# use strict;
-use DBI;
-use Time::Local;
-use POSIX qw( strftime );
-
-require("/usr/lib/sourceforge/utils/include.pl");
-$dbh = &db_connect();
-
-my ($sql, $rel);
-my ($day_begin, $day_end, $mday, $year, $mon, $week, $day);
-my $verbose = 1;
-
-##
-## Set begin and end times (in epoch seconds) of day to be run
-## Either specified on the command line, or auto-calculated
-## to run yesterday's data.
-##
-if ( $ARGV[0] && $ARGV[1] && $ARGV[2] ) {
-
- $day_begin = timegm( 0, 0, 0, $ARGV[2], $ARGV[1] - 1, $ARGV[0] - 1900 );
- $day_end = timegm( 0, 0, 0, (gmtime( $day_begin + 86400 ))[3,4,5] );
-
-} else {
-
- ## Start at midnight last night.
- $day_end = timegm( 0, 0, 0, (gmtime( time() ))[3,4,5] );
- ## go until midnight yesterday.
- $day_begin = timegm( 0, 0, 0, (gmtime( time() - 86400 ))[3,4,5] );
-
-}
-
- ## Preformat the important date strings.
-$year = strftime("%Y", gmtime( $day_begin ) );
-$mon = strftime("%m", gmtime( $day_begin ) );
-$week = strftime("%U", gmtime( $day_begin ) ); ## GNU ext.
-$day = strftime("%d", gmtime( $day_begin ) );
-print "Running week $week, day $day month $mon year $year \n" if $verbose;
-
-
-
-##
-## Now we're going to pull in every column...
-##
-
-## group_ranking
-$sql = "INSERT INTO stats_project_build_tmp
- SELECT group_id,'group_ranking',ranking
- FROM project_metric";
-$rel = $dbh->prepare($sql)->execute();
-print "Inserted group_ranking from project_metric...\n" if $verbose;
-
-## group_metric
-$sql = "INSERT INTO stats_project_build_tmp
- SELECT group_id,'group_metric',percentile
- FROM project_metric";
-$rel = $dbh->prepare($sql)->execute();
-print "Inserted percentile from project_metric...\n" if $verbose;
-
-## developers
-$sql = "INSERT INTO stats_project_build_tmp
- SELECT group_id,'developers',COUNT(user_id)
- FROM user_group
- GROUP BY group_id";
-$rel = $dbh->prepare($sql)->execute();
-print "Inserted developers from user_group...\n" if $verbose;
-
-## file_releases
-$sql = "INSERT INTO stats_project_build_tmp
- SELECT group_id,'file_releases',COUNT(release_id)
- FROM frs_release,frs_package
- WHERE ( frs_release.release_date > $day_begin AND frs_release.release_date < $day_end
- AND frs_release.package_id = frs_package.package_id )
- GROUP BY group_id";
-$rel = $dbh->prepare($sql)->execute();
-print "Insert file_releases from frs_release,frs_package...\n" if $verbose;
-
-## downloads
-$sql = "INSERT INTO stats_project_build_tmp
- SELECT group_id,'downloads',downloads
- FROM frs_dlstats_group_agg
- WHERE ( day = '$year$mon$day' )
- GROUP BY group_id";
-$rel = $dbh->prepare($sql)->execute();
-print "Insert downloads from frs_dlstats_group_agg...\n" if $verbose;
-
-## site_views
-$sql = "INSERT INTO stats_project_build_tmp
- SELECT group_id,'site_views',count
- FROM stats_agg_logo_by_group
- WHERE ( day = '$year$mon$day' )
- GROUP BY group_id";
-$rel = $dbh->prepare($sql)->execute();
-print "Insert site_views from activity_log...\n" if $verbose;
-
-if ( $ARGV[0] && $ARGV[1] && $ARGV[2] ) {
- ## register_time
- $sql = "INSERT INTO stats_project_build_tmp
- SELECT group_id,'register_time',register_time
- FROM groups
- GROUP BY group_id";
- $rel = $dbh->prepare($sql)->execute();
- print "Insert register_time from groups...\n" if $verbose;
-
-} else {
- ## site_views
- $sql = "INSERT INTO stats_project_build_tmp
- SELECT group_id,'site_views',COUNT(group_id)
- FROM activity_log_old
- WHERE ( day = '$year$mon$day' AND type = 0 )
- GROUP BY group_id";
- $rel = $dbh->prepare($sql)->execute();
- print "Insert site_views from activity_log...\n" if $verbose;
-}
-
-print "Postponed: subdomain_views need to be inserted later from the project server logs...\n" if $verbose;
-
-## msg_posted
-$sql = "INSERT INTO stats_project_build_tmp
- SELECT forum_group_list.group_id,'msg_posted',COUNT(forum.msg_id)
- FROM forum_group_list, forum
- WHERE ( forum_group_list.group_forum_id = forum.group_forum_id
- AND forum.date > $day_begin AND forum.date < $day_end )
- GROUP BY group_id";
-$rel = $dbh->prepare($sql)->execute();
-print "Insert msg_posted from forum_group_list and forum...\n" if $verbose;
-
-## msg_uniq_auth
-$sql = "INSERT INTO stats_project_build_tmp
- SELECT forum_group_list.group_id,'msg_uniq_auth',COUNT( DISTINCT(forum.posted_by) )
- FROM forum_group_list, forum
- WHERE ( forum_group_list.group_forum_id = forum.group_forum_id
- AND forum.date > $day_begin AND forum.date < $day_end )
- GROUP BY group_id";
-$rel = $dbh->prepare($sql)->execute();
-print "Insert msg_uniq_auth from forum_group_list and forum...\n" if $verbose;
-
-## bugs_opened
-$sql = "INSERT INTO stats_project_build_tmp
- SELECT group_id,'bugs_opened',COUNT(bug_id)
- FROM bug
- WHERE ( date > $day_begin AND date < $day_end )
- GROUP BY group_id";
-$rel = $dbh->prepare($sql)->execute();
-print "Insert bugs_opened from bug...\n" if $verbose;
-
-## bugs_closed
-$sql = "INSERT INTO stats_project_build_tmp
- SELECT group_id,'bugs_closed',COUNT(bug_id)
- FROM bug
- WHERE ( close_date > $day_begin AND close_date < $day_end )
- GROUP BY group_id";
-$rel = $dbh->prepare($sql)->execute();
-print "Insert bugs_closed from bug...\n" if $verbose;
-
-## support_opened
-$sql = "INSERT INTO stats_project_build_tmp
- SELECT group_id,'support_opened',COUNT(support_id)
- FROM support
- WHERE ( open_date > $day_begin AND open_date < $day_end )
- GROUP BY group_id";
-$rel = $dbh->prepare($sql)->execute();
-print "Insert support_opened from support...\n" if $verbose;
-
-## support_closed
-$sql = "INSERT INTO stats_project_build_tmp
- SELECT group_id,'support_closed',COUNT(support_id)
- FROM support
- WHERE ( close_date > $day_begin AND close_date < $day_end )
- GROUP BY group_id";
-$rel = $dbh->prepare($sql)->execute();
-print "Insert support_closed from support...\n" if $verbose;
-
-## patches_opened
-$sql = "INSERT INTO stats_project_build_tmp
- SELECT group_id,'patches_opened',COUNT(patch_id)
- FROM patch
- WHERE ( open_date > $day_begin AND open_date < $day_end )
- GROUP BY group_id";
-$rel = $dbh->prepare($sql)->execute();
-print "Insert patches_opened from patch...\n" if $verbose;
-
-## patches_closed
-$sql = "INSERT INTO stats_project_build_tmp
- SELECT group_id,'patches_closed',COUNT(patch_id)
- FROM patch
- WHERE ( close_date > $day_begin AND close_date < $day_end )
- GROUP BY group_id";
-$rel = $dbh->prepare($sql)->execute();
-print "Insert patches_closed from patch...\n" if $verbose;
-
-## tasks_opened
-$sql = "INSERT INTO stats_project_build_tmp
- SELECT group_project_id as group_id,'tasks_opened',
- COUNT(project_task_id)
- FROM project_task
- WHERE ( start_date > $day_begin AND start_date < $day_end )
- GROUP BY group_id";
-$rel = $dbh->prepare($sql)->execute();
-print "Insert tasks_opened from project_task...\n" if $verbose;
-
-## tasks_closed
-$sql = "INSERT INTO stats_project_build_tmp
- SELECT group_project_id as group_id,'tasks_closed',
- COUNT(project_task_id)
- FROM project_task
- WHERE ( end_date > $day_begin AND end_date < $day_end )
- GROUP BY group_id";
-$rel = $dbh->prepare($sql)->execute();
-print "Insert tasks_closed from project_task...\n" if $verbose;
-
-## help_requests
-$sql = "INSERT INTO stats_project_build_tmp
- SELECT group_id,'help_requests',
- COUNT(job_id)
- FROM people_job
- WHERE ( date > $day_begin AND date < $day_end )
- GROUP BY group_id";
-$rel = $dbh->prepare($sql)->execute();
-print "Insert help_requests from people_job...\n" if $verbose;
-
-##
-## Create the daily tmp table for the update.
-##
-$sql="DROP TABLE IF EXISTS stats_project_tmp";
-$rel = $dbh->prepare($sql)->execute();
-print "Dropping stats_project_tmp in preparation...\n" if $verbose;
-
-$sql = "CREATE TABLE stats_project_tmp (
- month int(11) DEFAULT '0' NOT NULL,
- week int(11) DEFAULT '0' NOT NULL,
- day int(11) DEFAULT '0' NOT NULL,
- group_id int(11) DEFAULT '0' NOT NULL,
- group_ranking int(11) DEFAULT '0' NOT NULL,
- group_metric float(8,5) DEFAULT '0' NOT NULL,
- developers smallint(6) DEFAULT '0' NOT NULL,
- file_releases smallint(6) DEFAULT '0' NOT NULL,
- downloads int(11) DEFAULT '0' NOT NULL,
- site_views int(11) DEFAULT '0' NOT NULL,
- subdomain_views int(11) DEFAULT '0' NOT NULL,
- msg_posted smallint(6) DEFAULT '0' NOT NULL,
- msg_uniq_auth smallint(6) DEFAULT '0' NOT NULL,
- bugs_opened smallint(6) DEFAULT '0' NOT NULL,
- bugs_closed smallint(6) DEFAULT '0' NOT NULL,
- support_opened smallint(6) DEFAULT '0' NOT NULL,
- support_closed smallint(6) DEFAULT '0' NOT NULL,
- patches_opened smallint(6) DEFAULT '0' NOT NULL,
- patches_closed smallint(6) DEFAULT '0' NOT NULL,
- tasks_opened smallint(6) DEFAULT '0' NOT NULL,
- tasks_closed smallint(6) DEFAULT '0' NOT NULL,
- help_requests smallint(6) DEFAULT '0' NOT NULL,
- cvs_checkouts smallint(6) DEFAULT '0' NOT NULL,
- cvs_commits smallint(6) DEFAULT '0' NOT NULL,
- cvs_adds smallint(6) DEFAULT '0' NOT NULL,
- KEY idx_project_log_group (group_id)
-)";
-$rel = $dbh->prepare($sql)->execute();
-print "Created stats_project_tmp for agregation...\n" if $verbose;
-
-##
-## Populate the stats_archive_project_tmp table the old
-## fashioned way. (It's cleaner/faster than making the 3! tmp tables
-## needed to merge the stats_project_build_tmp into the
-## stats_archive_project_tmp with MySQL.. if you can
-## believe that.)
-##
-
-my (%stat_data, $group_id, $column, $value, @ar);
-
-$sql = "SELECT DISTINCT group_id FROM stats_project_build_tmp";
-$rel = $dbh->prepare($sql);
-$rel->execute() or die "db_archive_stats_update.pl: Failed to run aggregates.\n";
-
-while ( @ar = $rel->fetchrow_array ) {
- $group_id = $ar[0];
- $stat_data{$group_id} = {};
- $stat_data{$group_id}{"month"} = "$year$mon";
- $stat_data{$group_id}{"week"} = $week;
- $stat_data{$group_id}{"day"} = $day;
-}
-print "Beginning collation of " . $rel->rows . " project records..." if $verbose;
-$rel->finish();
-
-
-foreach $group_id ( keys %stat_data ) {
-
- $sql = "SELECT * FROM stats_project_build_tmp WHERE group_id=$group_id";
- $rel = $dbh->prepare($sql);
- $rel->execute();
- while ( ($column, $value) = ($rel->fetchrow_array)[1,2] ) {
- $stat_data{$group_id}{$column} = $value;
- }
- $rel->finish();
-
- if ( $stat_data{$group_id}{"register_time"} < $day_end ) {
-
- delete $stat_data{$group_id}{"register_time"};
- $sql = "INSERT INTO stats_project_tmp SET ";
- $sql .= "group_id=$group_id,";
- $sql .= join( ",",
- map { "$_\=\'$stat_data{$group_id}{$_}\'" } (keys %{$stat_data{$group_id}})
- );
- $rel = $dbh->prepare($sql);
- $rel->execute();
- }
-}
-print "Finished.\n" if $verbose;
-
-
-##
-## Drop the tmp table.
-##
-$sql = "DROP TABLE IF EXISTS stats_project_build_tmp";
-$rel = $dbh->prepare($sql)->execute();
-print "Dropped stats_project_build_tmp...\n" if $verbose;
-
-
-##
-## Build the rest of the indexes on the temp table before we merge
-## back into the live table. (to reduce locking time on live table)
-##
-
-$sql = "CREATE INDEX idx_project_stats_day
- on stats_project_tmp(day)";
-$rel = $dbh->prepare($sql)->execute();
-
-$sql = "CREATE INDEX idx_project_stats_week
- on stats_project_tmp(week)";
-$rel = $dbh->prepare($sql)->execute();
-
-$sql = "CREATE INDEX idx_project_stats_month
- on stats_project_tmp(month)";
-$rel = $dbh->prepare($sql)->execute();
-print "Added further indexes to stats_project_tmp...\n" if $verbose;
-
-##
-## Merge tmp table back into the live stat table
-##
-$sql = "DELETE FROM stats_project WHERE month='$year$mon' AND day='$day'";
-$rel = $dbh->prepare($sql)->execute();
-print "Cleared Old data from stats_project...\n" if $verbose;
-
-$sql = "INSERT INTO stats_project
- SELECT * FROM stats_project_tmp";
-$rel = $dbh->prepare($sql)->execute();
-print "Wrote back new data to stats_project...\n" if $verbose;
-
-print "Done.\n" if $verbose;
-exit;
-
-##
-## EOF
-##
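The collation loop above pivots narrow `(group_id, stat_name, value)` rows from `stats_project_build_tmp` into one wide record per project, seeded with the shared date fields, before writing `stats_project_tmp`. A minimal Python sketch of that pivot (function and parameter names are assumed for illustration, not taken from the deleted script):

```python
def collate(rows, month, week, day):
    """Pivot (group_id, column, value) triples into one dict per group,
    seeding each record with the shared date fields, mirroring the
    collation step of the deleted db_stats_projects_nightly.pl."""
    stats = {}
    for group_id, column, value in rows:
        # Create the per-project record on first sight, then fold in
        # each named statistic as a key/value pair.
        rec = stats.setdefault(
            group_id, {"month": month, "week": week, "day": day})
        rec[column] = value
    return stats
```

As in the Perl original, a later statistic for the same `(group_id, column)` pair would silently overwrite an earlier one; the build table is expected to hold at most one row per pair.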
Deleted: trunk/gforge_base/evolvisforge/gforge/utils/underworld-root/db_stats_site_nightly.pl
===================================================================
--- trunk/gforge_base/evolvisforge/gforge/utils/underworld-root/db_stats_site_nightly.pl 2010-02-25 13:21:48 UTC (rev 409)
+++ trunk/gforge_base/evolvisforge/gforge/utils/underworld-root/db_stats_site_nightly.pl 2010-02-25 13:21:51 UTC (rev 410)
@@ -1,111 +0,0 @@
-#!/usr/bin/perl
-#
-# $Id: db_stats_site_nightly.pl,v 1.8 2000/10/11 19:55:39 tperdue Exp $
-#
-
-use DBI;
-use Time::Local;
-use POSIX qw( strftime );
-
-require("/usr/lib/sourceforge/utils/include.pl");
-$dbh = &db_connect();
-
-my ($sql, $rel, $day_begin, $day_end, $mon, $week, $day);
-my $verbose = 1;
-
-
-if ( $ARGV[0] && $ARGV[1] && $ARGV[2] ) {
-
- $day_begin = timegm( 0, 0, 0, $ARGV[2], $ARGV[1] - 1, $ARGV[0] - 1900 );
- $day_end = timegm( 0, 0, 0, (gmtime( $day_begin + 86400 ))[3,4,5] );
-
-} else {
-
- ## Start at midnight last night.
- $day_end = timegm( 0, 0, 0, (gmtime( time() ))[3,4,5] );
- ## go until midnight yesterday.
- $day_begin = timegm( 0, 0, 0, (gmtime( time() - 86400 ))[3,4,5] );
-
-}
-
- ## Preformat the important date strings.
-$year = strftime("%Y", gmtime( $day_begin ) );
-$mon = strftime("%Y%m", gmtime( $day_begin ) );
-$month = strftime("%m", gmtime( $day_begin ) );
-$week = strftime("%U", gmtime( $day_begin ) ); ## GNU ext.
-$day = strftime("%d", gmtime( $day_begin ) );
-print "Running week $week, day $day month $month year $year ($mon)\n" if $verbose;
-
-
-##
-## And now, we calculate the aggregate stats for the site.
-##
-
-## site_views
-##
-
-$sql = "SELECT count FROM stats_agg_pages_by_day WHERE (day='$mon$day')";
-($rel = $dbh->prepare($sql))->execute() || die "SQL error: $!";
-($site_views) = ($rel->fetchrow_array)[0];
-
-## subdomain_views
-##
-$sql = "SELECT SUM(subdomain_views) FROM stats_project WHERE ( month='$mon' AND day='$day' ) GROUP BY month,day";
-($rel = $dbh->prepare($sql))->execute() || die "SQL error: $!";
-($subdomain_views) = ($rel->fetchrow_array)[0];
-
-## downloads
-##
-$sql = "SELECT SUM(downloads) FROM frs_dlstats_group_agg WHERE ( day='$mon$day' ) GROUP BY day";
-($rel = $dbh->prepare($sql))->execute() || die "SQL error: $!";
-($downloads) = ($rel->fetchrow_array)[0];
-
-## uniq_users
-##
-$sql = "SELECT COUNT(DISTINCT(user_id)) FROM session WHERE (time < $day_end AND time > $day_begin)";
-($rel = $dbh->prepare($sql))->execute() || die "SQL error: $!";
-($uniq_users) = ($rel->fetchrow_array)[0];
-
-## sessions
-##
-$sql = "SELECT COUNT(session_hash) FROM session WHERE (time < $day_end AND time > $day_begin)";
-($rel = $dbh->prepare($sql))->execute() || die "SQL error: $!";
-($sessions) = ($rel->fetchrow_array)[0];
-
-## total_users
-##
-$sql = "SELECT COUNT(user_id) FROM users WHERE ( add_date < $day_end AND status='A' )";
-($rel = $dbh->prepare($sql))->execute() || die "SQL error: $!";
-($total_users) = ($rel->fetchrow_array)[0];
-
-## new_users
-##
-$sql = "SELECT COUNT(user_id) FROM users WHERE ( add_date < $day_end AND add_date > $day_begin )";
-($rel = $dbh->prepare($sql))->execute() || die "SQL error: $!";
-($new_users) = ($rel->fetchrow_array)[0];
-
-## new_projects
-##
-$sql = "SELECT COUNT(group_id) FROM groups WHERE ( register_time < $day_end AND register_time > $day_begin )";
-($rel = $dbh->prepare($sql))->execute() || die "SQL error: $!";
-($new_projects) = ($rel->fetchrow_array)[0];
-
-
-
-##
-## Merge the nightly site info back into the live stat table.
-##
-
-$sql = "DELETE FROM stats_site WHERE (month='$mon' AND day='$day')";
-$rel = $dbh->do($sql) || die("SQL error: $!");
-
-$sql = "INSERT INTO stats_site VALUES "
- . "('$mon','$week','$day',"
- . "'$site_views','$subdomain_views','$downloads','$uniq_users',"
- . "'$sessions','$total_users','$new_users','$new_projects')";
-$rel = $dbh->do($sql) || die("SQL error: $!");
-print "Wrote back new data to stats_site...\n" if $verbose;
-
-##
-## EOF
-##
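The deleted `db_stats_site_nightly.pl` derives a one-day UTC window `[day_begin, day_end)` either from explicit `YYYY MM DD` arguments or, by default, from "yesterday" relative to the current time. A Python sketch of the same logic, assuming UTC throughout (the `now` parameter is an addition for testability, not part of the original):

```python
import calendar
import time

def day_window(year=None, month=None, day=None, now=None):
    """Return (day_begin, day_end) as epoch seconds bounding one UTC
    day, mirroring the timegm/gmtime dance of the deleted script."""
    if year and month and day:
        # Explicit date given on the command line.
        day_begin = calendar.timegm((year, month, day, 0, 0, 0))
    else:
        # Default: the day that ended at midnight last night.
        now = time.time() if now is None else now
        t = time.gmtime(now - 86400)
        day_begin = calendar.timegm(
            (t.tm_year, t.tm_mon, t.tm_mday, 0, 0, 0))
    # In UTC epoch arithmetic, the next midnight is exactly 86400 s on.
    day_end = day_begin + 86400
    return day_begin, day_end
```

The begin/end pair then bounds the `time`/`add_date`/`register_time` range predicates in the per-day SQL queries.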
Deleted: trunk/gforge_base/evolvisforge/gforge/utils/underworld-root/run_span.pl
===================================================================
--- trunk/gforge_base/evolvisforge/gforge/utils/underworld-root/run_span.pl 2010-02-25 13:21:48 UTC (rev 409)
+++ trunk/gforge_base/evolvisforge/gforge/utils/underworld-root/run_span.pl 2010-02-25 13:21:51 UTC (rev 410)
@@ -1,23 +0,0 @@
-#!/usr/bin/perl
-use Time::Local;
-
-$script = "db_stats_site_nightly.pl";
-$span = $ARGV[0];
-$year = $ARGV[1];
-$month = $ARGV[2];
-$day = $ARGV[3];
-
-$| = 0;
-print "Processing $span day span from $month/$day/$year ...\n";
-
-for ( $i = 1; $i <= $span; $i++ ) {
-
- $command = "perl $script $year $month $day";
- print STDERR "Running \'$command\' from the current directory...\n";
- print STDERR `$command`;
-
- ($year,$month,$day) = (gmtime( timegm(0,0,0,$day + 1,$month - 1,$year - 1900) ))[5,4,3];
- $year += 1900;
- $month += 1;
-}
-
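The deleted `run_span.pl` walks a span of consecutive calendar days, relying on `timegm`/`gmtime` round-tripping to normalize month and year rollover. The same iteration can be sketched in Python with `datetime` doing the normalization (names here are illustrative):

```python
from datetime import date, timedelta

def span_days(year, month, day, span):
    """Yield (year, month, day) for `span` consecutive UTC dates,
    rolling over month and year boundaries as the deleted
    run_span.pl did via timegm/gmtime normalization."""
    d = date(year, month, day)
    for _ in range(span):
        yield d.year, d.month, d.day
        d += timedelta(days=1)  # datetime handles the rollover
```

The original then shelled out to `db_stats_site_nightly.pl` once per yielded date, backfilling one day of site statistics per invocation.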
Deleted: trunk/gforge_base/evolvisforge/gforge/utils/underworld-root/stats_nightly.sh
===================================================================
--- trunk/gforge_base/evolvisforge/gforge/utils/underworld-root/stats_nightly.sh 2010-02-25 13:21:48 UTC (rev 409)
+++ trunk/gforge_base/evolvisforge/gforge/utils/underworld-root/stats_nightly.sh 2010-02-25 13:21:51 UTC (rev 410)
@@ -1,13 +0,0 @@
-#!/bin/sh
-
-## The order these scripts are run in is CRITICAL
-## DO NOT change their order. Add before, or add after
-##
-/usr/lib/sourceforge/bin/db_stats_prepare.pl $*
-# /usr/lib/sourceforge/bin/db_stats_cvs_history.pl $*
-/usr/lib/sourceforge/bin/db_stats_projects_nightly.pl $*
-##
-## END order sensitive section
-##
-
-/usr/lib/sourceforge/bin/db_stats_site_nightly.pl $*
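The deleted `stats_nightly.sh` runs its stages in a fixed, order-critical sequence but does not stop if one fails. A hypothetical fail-fast driver for the same pipeline could look like this (the `runner` hook is an assumption added for testability; the script paths are those from the deleted file):

```python
import subprocess

# Order is critical, per the comment in the deleted stats_nightly.sh.
NIGHTLY_STAGES = [
    "/usr/lib/sourceforge/bin/db_stats_prepare.pl",
    "/usr/lib/sourceforge/bin/db_stats_projects_nightly.pl",
    "/usr/lib/sourceforge/bin/db_stats_site_nightly.pl",
]

def run_nightly(argv, runner=subprocess.run):
    """Run each stage in order with the shared argv, aborting on the
    first failure (unlike the original sh, which ignored exit codes)."""
    for script in NIGHTLY_STAGES:
        result = runner([script, *argv])
        if result.returncode != 0:
            raise RuntimeError(f"nightly stage failed: {script}")
```

Stopping on the first failure prevents `db_stats_site_nightly.pl` from aggregating per-project rows that the earlier stages never wrote.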