Peer-Review Record

A Progressive and Combined Building Simplification Approach with Local Structure Classification and Backtracking Strategy

ISPRS Int. J. Geo-Inf. 2021, 10(5), 302; https://doi.org/10.3390/ijgi10050302
by Zhiwei Wei 1,2,*, Yang Liu 3, Lu Cheng 3 and Su Ding 4
Reviewer 1: Anonymous
Reviewer 2: Anonymous
Reviewer 3: Anonymous
Submission received: 4 March 2021 / Revised: 28 April 2021 / Accepted: 2 May 2021 / Published: 5 May 2021

Round 1

Reviewer 1 Report

The paper is well written and describes the introduced method very well. However, it lacks novelty: the agent-based method of the EU project AGENT already used backtracking to determine optimal local operations. There are two aspects which I find interesting:

  • using successful generalizations as templates
  • you determine "all" intermediate scales, which lends itself very nicely to a continuous generalization scheme. A potential drawback of this, however (and this is not clear from the presentation), concerns generating a specific target scale: on lines 570 ff. you write that for a target scale of 1:25,000 you select representations between adjacent scales, e.g. 1:19,000 and 1:42,000. This means that you also accept rather coarse representations as belonging to the given target scale.

Here are some more comments/questions:

  • Is it possible to generate a precise target scale (as most of the other approaches do)?
  • On p. 17 ff. you describe that you delete representations that do not fit into the sequence: could you also include this as a factor in the generation of the representations, namely that the chosen local operation should lead to the next coarser scale?
  • Your approach does not take area preservation explicitly into account; this is also clearly shown in your measures, where your approach does not perform as well as the others. Could you include this in your approach as well (as other methods do)?
  • You always keep all buildings: why do you not eliminate small ones?
  • I like your approach of using other successful generalizations as templates:
    • How often was it applied in the process?
    • How do you select the most appropriate template?

Author Response

Dear Editors and reviewers:

       Thank you for your letter and comments concerning our manuscript entitled “A progressive and combined building simplification approach with local structure classification and backtracking strategy” (ijgi-1151779). We have revised our paper as the guidelines suggested, and we hope it meets with your approval. Details of the main corrections can be found in the attachment named “response”.

       Once again, many thanks for the comments and suggestions from the editor and the reviewers.

Kind regards,

The authors

Author Response File: Author Response.docx

Reviewer 2 Report

Dear Authors,

thank you for an interesting paper. I enjoyed reading it. Below, please find some suggestions and requests for further clarification aimed at improving the paper's quality and comprehensiveness.

  1. In Figure 2, “Rules for violation detection” are marked. Where exactly do you get those rules from, and in what form do they exist? It would be helpful to add some additional explanation of generalization rules/map generalization guidelines to the paper.
  2. In step 3 of your framework you write about verification of the result, namely whether a satisfactory result was obtained. However, it is not clear to me how you verify this 'satisfaction'.
  3. In Section 3.3 you state that the backtracking strategy is applied in case of an unsatisfactory result. How do you measure or specify the unsatisfactory result at this stage?
  4. In Section 3.3.3 some thresholds/parameters appear. Where do you get them from? How do you specify them?
  5. The results look promising indeed (Figures 7 and 8). However, from my experience of large-scale generalization, at 1:50k other generalization operations should be applied apart from simplification, such as selection and aggregation. I understand that you conduct only experiments related to simplification and name this as a constraint of your approach in the Limitations section, but including them would change your results a lot. Actually, at 1:50k some built-up areas usually appear on the paper map. I believe it would be good to state this also somewhere earlier in the paper than the Limitations section.
  6. I appreciate that you compare your results to other algorithms (the ESRI solution among others); however, I would also consider comparing the building shapes to well-designed paper maps at the target scales. This would supply additional, important verification.
  7. You note that user requirements were taken into account. What do you mean by this statement? In what respect?
  8. Please improve the scale bar in all maps. The base of the scale bar should be a round value, such as 100, 200 or 500 meters. For instance, you can label 0, 500 and 1000 meters and keep tick marks and labels only for these values on the graphical scale. This is more appropriate from a cartographic and graphic point of view.

Kind regards

Author Response

Dear Editors and reviewers:

       Thank you for your letter and comments concerning our manuscript entitled “A progressive and combined building simplification approach with local structure classification and backtracking strategy” (ijgi-1151779). We have revised our paper as the guidelines suggested, and we hope it meets with your approval. Details of the main corrections can be found in the attachment named “response”.

       Once again, many thanks for the comments and suggestions from the editor and the reviewers.

Kind regards,

The authors

Author Response File: Author Response.docx

Reviewer 3 Report

The authors present a method for building simplification based on several algorithms and a constraint/backtracking strategy. The study has many interesting results, but it (as well as the presentation) needs to be improved.

If I understand it correctly, you are using a strategy where the previously generalized building is used in the next step. This is of course good if you intend to use continuous generalization (as noted in the discussion). But you should also note that there are drawbacks to this approach. For example, in Figure 9 there are some buildings that change their main direction at scale 1:25,000. What happens if you then apply template matching to 1:50,000 on these buildings? In this case it would have been better to start the template matching from the original building (at 1:10,000).

Figure 2 is good, but it is still a bit hard to follow the methodology section. I would suggest that you use the steps in the headings to help the reader, e.g. “3.3. Step 3: LS …”. It is, for example, not straightforward to understand how Figure 5 relates to the overall framework in Figure 2.

In some places it would have been good if you illustrated your concepts in figures. Examples are: (1) the first paragraph in 3.1, (2) the first paragraph in 3.3.1.

In the abstract you state that the results satisfy cartographic requirements. This is too strong a statement; you have, for example, not set the buildings in relation to other themes. What you have shown is that your method provides good results in terms of the constraints you have defined.

In the discussion you write that further work could be to add removal and aggregation operators. I totally agree with this (and would also add typification). This is clearly seen in Figure 8: in my view this generalization result is really bad. If you add other themes to the map (roads etc.) you will find that the map becomes unreadable (which you need to comment on in the paper). This result is not surprising. The way your method is built, it would manage a generalization of roughly halving the scale, not more. This does not mean that your method is bad, but that it has more limitations than you currently state in the paper.

In the discussion you compare with other algorithms. Such a comparison is good. However, you should note that the comparison is made (partly) on the constraints you use to govern your own algorithm. This fact must be pointed out, since it biases the comparison. Therefore, I also think that the statement in the conclusion that your method has higher cartographic quality than the other methods is too strong. This has not been shown in an objective way.

The authors need to provide the code on GitHub or similar. This should be stated at the start of Section 4.


Detailed comments:

  • The word “Several” is missing at the start of the abstract.
  • Page 3, top: write the Legibility/Preservation constraints in bold and left-justify them.
  • Page 3: You need to define the orientation and position constraints here (or elsewhere). Otherwise it is, for example, difficult to interpret Figure 9.
  • Last words in Section 2: “may be” -> “are”.
  • In Section 3.1: What is a “sharp node”?
  • 2, first row: “will” -> “that”.
  • In Equations 1 and 2: Is this dependent on the scale levels M? This needs a bit more explanation.
  • In 3.3.1, second row: In the AE part, write e(pi, pi+1) … to show that it is consecutive points that build up the edges. Furthermore, I do not understand the last clause here (“in which …”). What does that imply?
  • Figure 3 caption: Explain the dashed lines.
  • Table 3: The text should be left-justified (using indentations where applicable).
  • Figure 6: Should you keep the form in the enlargement?
  • Section 5.3: Which National Administration of Surveying?

Author Response

Dear Editors and reviewers:

       Thank you for your letter and comments concerning our manuscript entitled “A progressive and combined building simplification approach with local structure classification and backtracking strategy” (ijgi-1151779). We have revised our paper as the guidelines suggested, and we hope it meets with your approval. Details of the main corrections can be found in the attachment named “response”.

       Once again, many thanks for the comments and suggestions from the editor and the reviewers.

Kind regards,

The authors

Author Response File: Author Response.docx

Round 2

Reviewer 3 Report

The paper has been improved from the first version. It is now possible to follow how the methodology works, and the scripts/data are also provided. The methodology works fine for generalisation to scale 1:25,000. For this part the paper is, in my view, ready for publication.

However, when the authors continue to use the same methodology for 1:50,000, the method fails. The main reason is that it does not include any removal/aggregation/typification of buildings, or creation of built-up areas. From a cartographic perspective, the result in Figure 9 is not acceptable when it comes to the smaller buildings (all generalised with the BE method), but it works fine for the larger buildings (generalised with e.g. the LS method). That is, the methodology presented is not capable of handling minor buildings at the 1:50,000 scale (which requires removal/aggregation/typification). When the authors evaluate their method they compare with other simplification algorithms that have the same shortcoming, so this evaluation is not useful. The evaluation has to be based on a comparison with a properly generalised map at 1:50,000.

The authors claim that they are using an area-preserving constraint in the generalisation. I guess this is implemented for LS and perhaps TS, but not for the BE algorithm (even though the authors describe the size constraint already on page 3); this is also how I interpret the workflow in Figure 2. Otherwise this constraint would not have approved the great enlargement of the buildings in Figure 9. Furthermore, it is difficult to determine the size of these enlargements, since they are not illustrated properly in Figure 10. In the top right graph (in Figure 10) there is a change in area of at most 0.3, but the changes in Figure 9 are surely larger. Is this the reason that there are no points visible for the buildings with the low IDs in this graph in Figure 10?

My suggestion is that the authors completely skip the 1:50,000 generalisation in this paper. The method presented for 1:25,000 is interesting and can be published as it is. The 1:50,000 case could then be future work, where they would have to include new operators/algorithms as stated above.

Author Response

Dear reviewer:

       Thank you for your letter and comments concerning our manuscript entitled “A progressive and combined building simplification approach with local structure classification and backtracking strategy” (ijgi-1151779). We have deleted the 1:50,000 generalisation from the manuscript as suggested, and all content related to the 1:50,000 generalisation has also been revised. We hope it meets with your approval. The main corrections have been highlighted with underlining in the manuscript.

Once again, many thanks for the comments and suggestions from the editor and the reviewers.


Kind regards,

The authors

Author Response File: Author Response.docx

Round 3

Reviewer 3 Report

Thanks for removing the 1:50,000 generalisation at this stage. My recommendation is now that the paper can be published.
